Uber, Lyft drivers aren’t employees after all, California voters say

California voters approved Prop. 22, which would exempt companies such as Uber and Lyft from having to classify their workers as employees, according to The Associated Press. The $200 million campaign in support of the measure was the most expensive in state history.

The ballot measure mandates that drivers for Uber, Lyft, and DoorDash receive new benefits, such as minimum hourly earnings. But drivers won't get the full protections and benefits that come with employment, as they might have under another law, AB5, which originally took aim at gig work. Labor groups, which opposed the measure, raised only a tenth as much money.

Uber and Lyft threatened to leave California — or drastically cut back service — if they were forced to classify drivers as employees. Uber’s CEO, Dara Khosrowshahi, predicted a sharp increase in fare prices and fewer drivers on the platform if Prop 22 failed. If Uber had to hire its drivers, he said, it would only have room for 280,000 workers instead of the 1.4 million who currently use the app. The company declared victory tonight, thanking the Prop 22 supporters.

The fight over gig economy workers has been playing out in California for well over a year, but it intensified this summer when Uber and Lyft were ordered by a California Superior Court judge to immediately classify their drivers as employees. The ruling was in response to a preliminary injunction filed by California Attorney General Xavier Becerra as part of a lawsuit alleging the companies are in violation of the state’s AB5 law that went into effect on January 1st. The law enshrines the so-called “ABC test” to determine if someone is a contractor or an employee, and generally makes it more difficult for companies like Uber and Lyft to classify workers as independent contractors.

Uber and Lyft say most drivers prefer to be independent because of the flexibility and ability to set their own hours. But this status also forces drivers to shoulder all the costs of their work while depriving them of traditional employee benefits like paid sick leave, health insurance, and workers' compensation.

Waymo moved its self-driving cars in San Francisco to a ‘secured location’ in case of election chaos

Waymo is pulling its autonomous vehicles out of San Francisco in anticipation of Election Day unrest, The Verge has learned.

The Google spinoff is “temporarily pausing” its AV test operations in San Francisco on Tuesday and Wednesday and moving its fleet to Mountain View, where it will be parked in a “secured location,” according to an email from Transdev, Waymo’s fleet operations vendor.

The decision was made “out of an abundance of caution ahead of some of the planned protests around the general election,” Chris Cheung, general manager at Transdev North America, wrote in the email obtained by The Verge.

Two Waymo safety drivers told The Verge they got the word midday on Monday to manually drive their autonomous vehicles from San Francisco to Mountain View that afternoon. They then had to take Uber or Lyft rides back to their base in the city to retrieve their personal vehicles. A Waymo spokesperson said drivers will be reimbursed for those rides.

Safety drivers based in San Francisco will be paid while operations are suspended, Cheung said in the email. And Waymo’s fleet that is based in Mountain View will continue to test on public roads. “Your safety is our number one priority,” she added, “and we will continue to monitor the situation closely.”

Echoing Cheung’s email, the spokesperson for Waymo said, “Out of an abundance of caution and with the safety of our team in mind, we are temporarily suspending driving operations in San Francisco on 11/3 and 11/4.”

Waymo isn’t alone in its expectation of Election Day chaos. Businesses across the US are boarding up windows and stepping up security measures in anticipation of protests and possible looting. This was true in the Bay Area, where KRON4 reported that plywood was being used to cover up windows at the Westin St. Francis hotel off of Union Square. The results of the election are not expected to be finalized on November 3rd and may not be known for days to come.

The election will take place against the backdrop of a global pandemic and nationwide protests against the police killing of unarmed Black men and women. In May, protests in the Bay Area resulted in police firing tear gas and injuring demonstrators with rubber bullets. Hundreds of protestors marched peacefully through downtown San Francisco last Sunday in advance of the general election.

This isn’t the first time Waymo grounded its fleet of about 600 vehicles, more than half of which are based in Arizona. The company temporarily paused its vehicle testing operation back in March at the onset of the coronavirus pandemic. Some safety drivers had complained that the company was slow to respond to the health crisis, but Waymo insisted that it acted appropriately.

The company resumed testing in the Bay Area at the end of May, even as COVID-19 cases were increasing in California and across the country. Safety drivers expressed concern that the company continued to operate during widespread wildfires on the West Coast over the summer. Waymo eventually halted testing for a day in early September when the air quality registered as “very unhealthy.”

Update November 3rd, 11:46AM ET: Transdev General Manager Chris Cheung is a woman. A previous version of this story used the wrong pronoun. We regret the error.

Database: TimescaleDB 2.0 records and analyzes large time series

After two years of development, with several beta versions and a first release candidate along the way, the second release candidate (RC) of TimescaleDB 2.0 is now available. The scalable database specializes in time series, supports SQL, is built as an extension on top of PostgreSQL and, in contrast to conventional relational databases, scales out freely.

It offers a distributed multi-node architecture that, according to the developers, can store time-series data in the petabyte range and process it particularly quickly. For self-managed installations of the software the RC is considered production-ready; the final release and the rollout to Timescale's managed services are planned for the end of the year. The database is available free of charge as open source.

Continuous aggregation via updated APIs

According to the announcement in the Timescale blog, the release candidate also provides all enterprise features free of charge and grants users more rights than before. Continuous aggregation of data is handled via updated APIs that give users more control over the aggregation process. Background jobs can now be customized, and individual tasks and their behavior during execution can be controlled more precisely from within the database using schedules. On the speed of data processing, the makers point to rankings of the relational databases on the market, in which PostgreSQL currently sits in the top four, roughly on a par with MongoDB or just behind it.
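
To make the updated APIs more concrete, here is a minimal sketch, not taken from the announcement, of how a hypertable, a continuous aggregate, and a scheduled refresh policy are typically set up against TimescaleDB 2.0 from Python. The table, columns, and connection string are invented for illustration, and the exact function names and parameters should be checked against the current TimescaleDB documentation.

```python
# Minimal sketch (illustrative only): a hypertable plus a continuous aggregate
# with a scheduled refresh policy on TimescaleDB 2.0. Schema and DSN are made up.
import psycopg2

conn = psycopg2.connect("postgresql://user:password@localhost:5432/tsdb")  # placeholder DSN
conn.autocommit = True  # run each statement outside an explicit transaction block
cur = conn.cursor()

# A plain table turned into a hypertable partitioned on the time column.
cur.execute("""
    CREATE TABLE IF NOT EXISTS conditions (
        time        TIMESTAMPTZ NOT NULL,
        device_id   TEXT        NOT NULL,
        temperature DOUBLE PRECISION
    );
""")
cur.execute("SELECT create_hypertable('conditions', 'time', if_not_exists => TRUE);")

# Continuous aggregate: hourly averages that the database maintains incrementally.
cur.execute("""
    CREATE MATERIALIZED VIEW conditions_hourly
    WITH (timescaledb.continuous) AS
    SELECT time_bucket('1 hour', time) AS bucket,
           device_id,
           avg(temperature) AS avg_temp
    FROM conditions
    GROUP BY bucket, device_id
    WITH NO DATA;
""")

# A refresh policy is the kind of schedulable background job the release notes describe.
cur.execute("""
    SELECT add_continuous_aggregate_policy('conditions_hourly',
        start_offset      => INTERVAL '1 day',
        end_offset        => INTERVAL '1 hour',
        schedule_interval => INTERVAL '30 minutes');
""")

cur.close()
conn.close()
```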

Why a separate database for time series?

The database is not aimed at small data volumes but at particularly large data sets, often distributed across several servers and nodes, into which telemetry data flows continuously and which, for example at financial service providers or in scientific projects, can comprise over a billion data series per day. Production lines in factories, smart home devices, vehicles, the stock market, software stacks, and also personal devices, for example in the health sector, continuously produce telemetry data via apps, and the organizing criterion for that data is the time series.

Since the volume of such data series keeps growing and relational databases are apparently reaching their limits in collecting and processing it, the Timescale creators launched the project of a specialized database three and a half years ago. Companies as diverse as Bosch, Siemens, Credit Suisse, IBM, Samsung, Walmart, Uber, Microsoft, and Warner Music use and support the development, according to the provider. According to the Timescale blog, behind the project stands not only the PostgreSQL community and its ecosystem but also a developer community specifically interested in time series.

Further information

A look back at the first release candidate shows that a number of services were still paid offerings at the time and that adoption has apparently grown since then: according to the provider, downloads have risen from one million back then (2018) to ten million today. Further information on the second release candidate can be found in the Timescale blog, which lists a number of demo videos and offers several download options; the software is available as a Docker image and in other variants. For users already familiar with TimescaleDB, the changelog should be relevant, and the Timescale team has put together a guide for updating.

Waymo pulls back the curtain on 6.1 million miles of self-driving car data in Phoenix

Over 21 months in Arizona, Waymo’s vehicles were involved in 47 collisions and near-misses, none of which resulted in injuries

In its first report on its autonomous vehicle operations in Phoenix, Arizona, Waymo said that it was involved in 18 crashes and 29 near-miss collisions during 2019 and the first nine months of 2020.

These crashes included rear-enders, vehicle swipes, and even one incident when a Waymo vehicle was T-boned at an intersection by another car at nearly 40 mph. The company said that no one was seriously injured and “nearly all” of the collisions were the fault of the other driver.

The report is the deepest dive yet into the real-life operations of the world’s leading autonomous vehicle company, which recently began offering rides in its fully driverless vehicles to the general public. Autonomous vehicle (AV) companies can be a black box, with most firms keeping a tight lid on measurable metrics and only demonstrating their technology to the public under the most controlled settings.

Indeed, Waymo, which was spun out of Google in 2016, mostly communicates about its self-driving program through glossy press releases or blog posts that reveal scant data about the actual nuts and bolts of autonomous driving. But in this paper, and another also published today, the company is showing its work. Waymo says its intention is to build public trust in automated vehicle technology, but these papers also serve as a challenge to other AV competitors.

“This is a major milestone, we think, in transparency,” said Matthew Schwall, head of field safety at Waymo, in a briefing with reporters Wednesday. Waymo claims this is the first time that any autonomous vehicle company has released a detailed overview of its safety methodologies, including vehicle crash data, when not required by a government entity. “Our goal here is to kickstart a renewed industry dialogue in terms of how safety is assessed for these technologies,” Schwall said.

The two papers take different approaches. The first outlines a multilayered framework that maps out Waymo's approach to safety. It includes three layers:

  • Hardware, including the vehicle itself, the sensor suite, the steering and braking system, and the computing platform;
  • The automated driving system behavioral layer, such as avoiding collisions with other cars, successfully completing fully autonomous rides, and adhering to the rules of the road;
  • Operations, like fleet operations, risk management, and a field safety program to resolve potential safety issues.

The second paper is meatier, with detailed information on the company’s self-driving operations in Phoenix, including the number of miles driven and the number of “contact events” Waymo’s vehicles have had with other road users. This is the first time that Waymo has ever publicly disclosed mileage and crash data from its autonomous vehicle testing operation in Phoenix.

The public road testing data covers Waymo’s self-driving operations in Phoenix from January 2019 through September 2020. The company has approximately 600 vehicles as part of its fleet. More than 300 vehicles operate in an approximately 100-square-mile service area that includes the towns of Chandler, Gilbert, Mesa, and Tempe — though its fully driverless cars are restricted to an area that is only half that size. (Waymo hasn’t disclosed how many of its vehicles operate without safety drivers.)

Between January and December 2019, Waymo’s vehicles with trained safety drivers drove 6.1 million miles. In addition, from January 2019 through September 2020, its fully driverless vehicles drove 65,000 miles. Taken together, the company says this represents “over 500 years of driving for the average licensed US driver,” citing a 2017 survey of travel trends by the Federal Highway Administration.

Waymo says its vehicles were involved in 47 “contact events” with other road users, including other vehicles, pedestrians, and cyclists. Eighteen of these events occurred in real life, while 29 were in simulation. “Nearly all” of these collisions were the fault of a human driver or pedestrian, Waymo says, and none resulted in any “severe or life-threatening injuries.”

The company says it also counts events in which its trained safety drivers assume control of the vehicle to avoid a collision. Waymo’s engineers then simulate what would have happened had the driver not disengaged the vehicle’s self-driving system to generate a counterfactual, or “what if,” scenario. The company uses these events to examine how the vehicle would have reacted and then uses that data to improve its self-driving software. Ultimately, these counterfactual simulations can be “significantly more realistic” than simulated events that are generated “synthetically,” Waymo says.
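
The counterfactual idea can be sketched in a few lines of code. The toy example below is purely illustrative and not Waymo's system: it replays a logged disengagement with a simple planner model in place of the human takeover and reports whether the simulated run ends in contact. All names, the kinematics, and the assumption of a stationary obstacle are invented for the sketch.

```python
# Toy illustration only: a "what if" replay of a logged disengagement in which
# the self-driving stack, rather than the human driver, stays in control.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Snapshot:
    t: float                  # seconds since the start of the logged event
    ego_speed_mps: float      # logged speed of the autonomous vehicle
    gap_to_obstacle_m: float  # logged distance to the other road user

def counterfactual_collision(
    log: List[Snapshot],
    planner_decel: Callable[[Snapshot], float],  # braking the stack would command, m/s^2
    dt: float = 0.1,
) -> bool:
    """Return True if the simulated run (no human takeover) ends in contact."""
    speed = log[0].ego_speed_mps
    gap = log[0].gap_to_obstacle_m
    for snap in log:
        speed = max(0.0, speed - planner_decel(snap) * dt)  # simple kinematic update
        gap -= speed * dt  # obstacle treated as stationary in this toy model
        if gap <= 0.0:
            return True    # counterfactual contact event
    return False
```

In this toy framing, a run that returns True would be scored as a simulated contact event, alongside the real-world outcome in which the safety driver avoided the crash.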

This use of simulated scenarios sets Waymo apart from other AV operators, said Daniel McGehee, director of the National Advanced Driving Simulator Laboratories at the University of Iowa. That's because it allows Waymo to go deeper on a variety of issues that may contribute to a crash, such as sensor reliability or the interpretation of particular images by the vehicle's perception software. "They're really going beyond regular data," McGehee said in an interview. "And that's very new and very unique."

Waymo says the majority of its collisions were extremely minor and at low speeds. But the company highlighted eight incidents that it considered “most severe or potentially severe.” Three of these crashes occurred in real life and five only in simulation. Airbags were deployed in all eight incidents.

In the paper, Waymo outlines how "road rule violations" by other drivers contributed to each of the eight "severe" collisions.

The most common type of crash involving Waymo’s vehicles was rear-end collisions. Waymo said it was involved in 14 actual and two simulated fender-benders, and in all but one, the other vehicle was the one doing the rear-ending.

The one incident where Waymo rear-ended another vehicle was in simulation: the company determined that the AV would have rear-ended another car that swerved in front of it and then braked hard despite a lack of obstruction ahead — which the company says was “consistent with antagonistic motive.” (There have been dozens of reports of Waymo’s autonomous vehicles being harassed by other drivers, including attempts to run them off the road.) The speed of impact, had it occurred in real life, would have been 1 mph, Waymo says.

Waymo’s vehicles often drive hyper-cautiously or in ways that can frustrate a human driver — which can lead to fender-benders. But Waymo says its vehicles aren’t rear-ended more frequently than the average driver. “We don’t like getting rear ended,” Schwall said. “And we’re always looking for ways to get rear ended less.”

The only crash involving a fully driverless Waymo vehicle, without a safety driver behind the wheel, was also a rear-ending. The Waymo vehicle was slowing to stop at a red light when it was rear-ended by another vehicle traveling at 28 mph. Airbags deployed in both vehicles, even though there was no driver behind the Waymo’s steering wheel to benefit from it.

Just one crash took place with a passenger in a Waymo vehicle, in the Uber-like Waymo One ride-hailing service that’s been operating since 2018. By early 2020, Waymo One was doing 1,000 to 2,000 rides every week. Most of these rides had safety drivers, though 5 percent to 10 percent were fully driverless vehicles. The crash occurred when a Waymo vehicle with a safety driver behind the wheel was rear-ended by a vehicle traveling around 4 mph. No injuries were reported.

Waymo was also involved in 14 simulated crashes in which two vehicles collided at an intersection or while turning. There was also one actual collision. These types of crashes, called “angled” collisions, are important because they account for over a quarter of all vehicle collisions in the US, and nearly a quarter of all vehicle fatalities, Waymo says. The one actual, non-simulated angled collision occurred when a vehicle ran a red light at 36 mph, smashing into the side of a Waymo vehicle that was traveling through the intersection at 38 mph.

Fortunately, the “most severe” collision only took place in simulation. The Waymo vehicle was traveling at 41 mph when another vehicle suddenly crossed in front of it. In real life, the safety driver took control, braking in time to avoid a collision; in the simulation, Waymo’s self-driving system didn’t brake in time to prevent the crash. Waymo determined it could have reduced its speed to 29 mph before colliding with the other vehicle. The company says the crash “approaches the boundary” between two classifications of severe collisions that could have resulted in critical injuries.

Self-driving car safety has drawn additional scrutiny since March 2018, when an Uber test vehicle struck and killed a pedestrian in Tempe, Arizona, in the first pedestrian fatality involving a self-driving car. At the time, Waymo CEO John Krafcik said his company's vehicles would have avoided that fatal collision.

The vast majority of cars on the road today are controlled by humans, many of whom are terrible drivers — which means Waymo’s vehicles will continue to be involved in many more crashes. “The frequency of challenging events that were induced by incautious behaviors of other drivers serves as a clear reminder of the challenges in collision avoidance so long as AVs share roadways with human drivers,” Waymo says at the conclusion of its paper. AVs are expected to share the road with human drivers for decades to come, even under the rosiest predictions about the technology.

There’s no standard approach for evaluating AV safety. A recent study by RAND concluded that in the absence of a framework, customers are most likely to trust the government — even though US regulators appear content to let the private sector dictate what’s safe. In this vacuum, Waymo hopes that by publishing this data, policymakers, researchers, and even other companies may begin to take on the task of developing a universal framework.

To be sure, there is currently no federal rule requiring AV companies to submit information about their testing activities to the government. Instead, a patchwork of state-by-state regulations governs what is and isn't disclosed. California has the most stringent rules, requiring companies to obtain licenses for different types of testing, disclose vehicle crashes, and report both the number of miles driven and the frequency at which human safety drivers were forced to take control of their autonomous vehicles (also known as a "disengagement"). Unsurprisingly, AV companies hate California's requirements.

What Waymo has provided with these two papers is just a snapshot of a decade's worth of public road testing of autonomous vehicles, but a very important one nonetheless. Many of Waymo's competitors, including Argo, Aurora, Cruise, Zoox, and Nuro, publish blog posts detailing their approach to safety and submit data to California as part of the state's AV testing program, but not much beyond that. With these publications, Waymo is throwing down the gauntlet for the rest of the AV industry, the University of Iowa's McGehee said.

“I think it will go a long way to force other automated driving companies to reveal these kinds of data moving forward,” he said, “so when things go wrong, they provide a framework of data that is available to the public.”

Not all companies are proceeding with as much caution as Waymo. Tesla CEO Elon Musk recently called Waymo's approach to autonomous driving "impressive, but a highly specialized solution." Last week, his company released a beta software update called "Full Self-Driving" to a select group of customers. Musk claimed it was capable of "zero intervention drives," but within hours of the release, videos surfaced of Teslas swerving to avoid parked cars and other near misses.

Years ago, Waymo considered developing an advanced driver-assist system like Tesla's "Full Self-Driving" version of Autopilot but ultimately decided against it after becoming "alarmed" by the negative effects on the driver, said Nick Webb, Waymo's director of systems engineering. Drivers would zone out or fall asleep at the wheel. The experiment in driver assistance helped solidify Waymo's mission: fully autonomous or bust.

“We felt that Level 4 autonomy is the best opportunity to improve road safety,” Webb added. “And so we’ve committed to that fully.”

Evil makes sense of a messy world

Earlier this month, CBS’s Evil dropped its first season on Netflix. It arrived after what had felt for me like a listless few months; very little pop culture could hold my attention. And then out of nowhere I was transfixed.

Evil is a show that surprises you, which to me makes it one of last year’s best dramas. While the show is essentially a network procedural — perhaps the least surprising genre of television — the series is interested in stretching the boundaries of what that means, starting with its premise. Evil follows Dr. Kristen Bouchard (Katja Herbers), a forensic psychologist, and David Acosta (Mike Colter), a priest in training. Together, the two work as assessors for the Catholic church, investigating claims of the supernatural in order to determine if the church should get involved, usually for an exorcism.

Their dynamic is at least two kinds of broadcast staple: a will-they/won't-they and a believer/skeptic pairing at least as old as The X-Files. Their occupation, however, is unusual and, as far as I can tell, an extremely liberal interpretation of what real-life Catholic assessors do, which, according to a quick online search, seems to be something more like an ecclesiastical paralegal. And the job immediately provides fuel for interesting twists on shopworn stories. Like in the pilot, where Acosta must determine whether a serial killer is in fact possessed by a demon named Roy. A demon that then seems to haunt the skeptical Bouchard.

In Evil, there is usually an explanation for the supernatural, but the show always leaves just enough room for doubt to creep in: sometimes it’s an image no one can explain, a culprit no one ever sees again, or a smoking gun that makes no sense. In a genre largely concerned with wrapping everything up in an hour, Evil rejects closure. The only thing it believes, definitively, is that things are getting terrible in a way that they really haven’t before.

“The world is getting worse,” David Acosta tells Bouchard toward the end of the pilot episode, “because evil is no longer isolated. Bad people are talking to one another.”

I’ve been thinking about that line nonstop since I heard it. Bad people are talking to one another. It feels too neat and reductive to be completely accurate, and yet I feel its truth every time I see a pundit parrot white supremacist talking points or run-of-the-mill disinformation that’s come from the president of the United States himself. These conversations are happening every day. So, yeah. Evil seems stronger than before, and technology is good at helping it.

It’s the inverse of a lot of mass messaging about technology, which still trends toward vapid boosterism: Facebook connects us, Uber takes you places, GoFundMe helps you raise money to do things you believe in. This cheery facade was always rotten, red meat for investors propped up by a trembling skeleton of venture capital, but now it is absolutely putrescent. Facebook empowers dictators. Uber and Lyft lobby for legislation that will deny gig workers the employment status that would provide them with things as basic as minimum wage and paid overtime. GoFundMe is a testament to our failed health care system, where only people who are lucky enough to go viral can raise the money to pay off lifesaving care.

Social good is great branding, but technology is always an accelerant. You don’t see the evil until it’s too late, when the products are entrenched in the marketplace and the status quo has reasserted itself in devious new ways.

In Evil, technology is crucial: Bouchard and Acosta investigate their horror stories with the help of tech specialist Ben Shakir (Aasif Mandvi), who is usually integral to proving there’s a rational explanation to some supernatural occurrence. In one fun episode, he has to determine if a smart speaker is haunted. In another, a hacker breaks into a VR game to trick children into thinking a ghost controls their headsets. In Evil’s universe, technology functions as an explanation, and never as a tool of outright subjugation. If there is anything truly supernatural, it’s outside of the digital realm.

After some shoe-leather investigating, it usually turns out Evil’s occult concerns are — like any moral panic — a smoke screen for more everyday horrors. Like in a later episode, when a disillusioned young man is rejected by a woman he’s attracted to and is encouraged…

Fighting workplace harassment is going to take more than a hotline

Two female entrepreneurs are finding new ways to help employees tell their stories

Say you’re a female employee working for a small tech company. You’re sitting in a meeting with your boss, the CEO. He’s — hypothetically — a 38-year-old white man with $1.2 million in the bank. You have student debt, a sick cat, whatever. Basically, you need this job. But your boss, he won’t stop hitting on you. Slack messages asking you to drinks, photos of the views from his weekend bike rides. Then, during the Wednesday morning meeting, he takes things even further. Puts a hand on your knee, says he wants to get to know you better.

What do you do?

You could go to HR, try to work things out internally. Maybe your company has a dedicated HR specialist, rather than an overworked recruiter who’s been given HR responsibilities and doubles as the office manager. And what if that doesn’t work? What if there’s nothing HR can do?

Historically, the options were to quit or stay quiet. But a female entrepreneur in the UK is envisioning another way. Neta Meidav is the founder of Vault, an app that allows employees to document misconduct in real time. The idea came to her while sitting at her kitchen table, watching the Harvey Weinstein story unfold on her TV. It reminded her of an incident that had happened at her first job out of university, when her boss came on to her during a meeting. “I never even thought of reporting it,” she says. “I was terrified that my career would be crushed by a powerful man before it even started.”

Meidav realized a lot of the reasons she hadn’t spoken up had to do with fear. She’d been worried she wouldn’t be believed — but what if documenting the incident had been easy? She didn’t want to be the first one to report the boss who’d harassed her. But what if she’d known there were others?

The problem isn’t limited to the tech industry. Reporting harassment in any office is intimidating, particularly when the tools available to employees — like harassment hotlines — are geared more toward compliance than worker safety.

To really solve the problem, companies need to invest in resources that help build trust. This means dedicated HR teams and tools that make reporting mismanagement easy. If they don’t, employees are more likely to go public — speaking to reporters or, in true 2020 form, blowing the situation up on Twitter.

On Vault, employees can write up reports about workplace misconduct as they happen, either describing incidents or screenshotting digital interactions. These reports go into a timestamped ledger. Employees can submit them right away or wait until another worker has come forward with a similar claim. Vault doesn’t reveal the identity of employees to one another, just the assurance that someone else has spoken up.
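
The escrow mechanic described above is straightforward to picture in code. The sketch below is a toy, in-memory illustration of a timestamped ledger with conditional release, under the assumption that a held report becomes visible to HR only once a second report names the same person; the class and field names are hypothetical and say nothing about how Vault is actually built.

```python
# Toy, in-memory sketch of an escrowed report ledger. Purely illustrative;
# not Vault's implementation. Reporter identities are never exposed to peers.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class Report:
    reporter_id: str   # stays private to HR; other reporters never see it
    accused_id: str    # person the report concerns
    description: str
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    held_in_escrow: bool = True  # True = "release only if someone else reports too"

class Ledger:
    def __init__(self) -> None:
        self._reports: List[Report] = []

    def add(self, report: Report) -> None:
        """Timestamped append; then check whether escrowed reports can be released."""
        self._reports.append(report)
        self._release_if_matched(report.accused_id)

    def _release_if_matched(self, accused_id: str) -> None:
        """Two or more reports naming the same person release each other."""
        matches = [r for r in self._reports if r.accused_id == accused_id]
        if len(matches) >= 2:
            for r in matches:
                r.held_in_escrow = False

    def visible_to_hr(self) -> List[Report]:
        """HR sees immediate submissions plus any released escrowed reports."""
        return [r for r in self._reports if not r.held_in_escrow]
```

An employee who wants to report right away would simply create the report with held_in_escrow set to False.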

Employees also have the option of reporting workplace harassment anonymously. While most companies are hard-pressed to follow up on complaints without knowing a worker’s identity, Vault allows HR teams to chat with employees anonymously on the app, gathering more information without needing to know their name.

Not every company is willing to invest in a reporting tool like Vault. For those that aren’t, more employees are choosing to go to the media with their stories. But knowing who to speak to can be tricky.

That’s where Ariella Steinhorn comes in. The founder of Lioness Strategies, Steinhorn is part of a new wave of PR that’s focused on helping employees get the word out about their stories. She thought of the idea after a former boss came on to her on Slack — a situation she realized was likely happening to other women in the workplace.

Workers can submit claims to Lioness using an encrypted email address. After the firm vets the stories, it matches workers with reporters, using an unofficial network of journalists at national publications.

Steinhorn works with employment lawyers to review nondisclosure agreements that could make it difficult for employees to go on the record. She comes from a big tech background, having previously worked in communications at Uber, and knows what people are up against when they decide to speak out. Amber Scorah, Lioness’ director of strategy, grew up as a Jehovah’s Witness and has written a book about her decision to leave a high-control religion — another helpful background when working with certain large companies.

It’s not an accident that both Vault and Lioness were started by women who experienced harassment in the workplace, given how often these incidents occur. At a time when many of the problems roiling the tech industry were created by high-profile men, it seems fitting that the solutions could come from women.

Waymo and Daimler: here are the fully self-driving trucks

Waymo and Daimler, the parent company of Mercedes-Benz, have announced a new global partnership to supply trucks that do not require a driver

By Rosario Grasso, published at 09:41 in the Technology channel

Daimler will integrate Waymo's autonomous driving technology, considered the pinnacle of research in the sector, into its fleet of Freightliner Cascadia heavy semi-trailer trucks. It is one of the first results of the new agreement between the two companies, which in the long term aims to establish a strategic, global partnership to refine fully autonomous driving technology for trucks. The press release is available here.

Fully autonomous driving: between successes and setbacks

The two companies have so far worked independently on these technologies. Specifically, Waymo bought a small fleet of Peterbilt trucks, which it outfitted with sensors and autonomous driving software. It is currently conducting trials with these trucks in Arizona, New Mexico, and Texas.

For its part, Daimler announced in 2015 that it had started working on autonomous driving, showing a working prototype called the Freightliner Inspiration Truck. After a series of demonstrations, it finally unveiled the production version of this truck at CES in Las Vegas in 2019.

The development of fully autonomous driving technology is therefore proceeding at full speed, even if there is no shortage of setbacks. Uber, for example, has faced legal proceedings following the fatal accident involving one of its self-driving vehicles.

“Daimler Trucks is developing a custom Freightliner Cascadia truck chassis with technology systems supplied by Waymo, with the aim of setting the industry standard in reliability and safety. This chassis will enable integration of the Waymo Driver with its customized and scalable combination of hardware, software and computing,” reads an official statement issued by Daimler.

The final objective is to put into circulation trucks at Level 4 as defined by the Society of Automotive Engineers (SAE), that is, vehicles capable of driving without a human behind the wheel within a limited, specific area. Waymo already has some Level 4 vehicles in operation, specifically in the Phoenix area of Arizona.

Waymo, a Google spinoff, already has agreements of this kind with several other automakers. Before Daimler, it established strategic partnerships with Nissan-Renault, Fiat Chrysler, Jaguar Land Rover, and Volvo. Daimler's trucking division also has its own self-driving subsidiary, Torc Robotics.
