‘No one was in the driver seat’ during Tesla crash that killed two

Authorities in Spring, Texas, say they're 100% certain that no one was driving during a fatal Tesla crash on Saturday evening. According to KHOU-11, the 2019 Tesla Model S had two occupants inside, one aged 59 and the other aged 69, when the car went off the road at a slight curve, crashed into a tree, and burst into flames. Harris County Precinct 4 Constable Mark Herman said the fire took four hours and more than 30,000 gallons of water to extinguish, as the car's batteries continued to reignite the flames. "[Investigators] are 100-percent certain that no one was in the driver seat driving that vehicle at the time of impact," Herman said. "They are positive." While authorities haven't confirmed whether Tesla's Autopilot feature was engaged at the time of the crash, Herman said it was "almost impossible" for anyone to have been in the driver's seat. Tesla did not immediately respond to a request for comment.

Gen. Charles Q. Brown Jr., America’s first Black Air Force chief, on race, tech, and the trouble with AI

General Charles Q. Brown Jr. became the first Black chief of staff of the Air Force during a perilous moment for the United States. In the time between Brown's nomination and his unanimous confirmation by the Senate, George Floyd died under the knee of officer Derek Chauvin on a street in Minneapolis. While angry protests and a national reckoning over race unfolded around the country, Brown made the difficult decision to speak out with unusual frankness and depth of feeling for a military leader. "I'm thinking about how my nomination provides some hope but also comes with a heavy burden," he said in a video addressed to Air Force personnel. "I can't fix centuries of racism in our country, nor can I fix decades of discrimination that may have impacted members of our Air Force." Brown also entered his role as the U.S. was navigating a rapidly evolving global threat environment. The four-star general spent a good part of his career leading the Air Force's fight against nonstate terror groups, chiefly ISIS, in Iraq and Afghanistan. But now the U.S. is increasingly threatened by major state actors, mainly a resurgent Russia and an emergent China. These new opponents may attack in ways that aren't necessarily addressable with fighter planes and missiles. It'll be Brown's job to oversee the Air Force's shift in investment away from legacy platforms and toward technologies that will allow the U.S. to compete in the battle theaters of the future. I spoke to the general about these emerging threats, the Air Force's work with U.S…

This immersive technology turns hospitals into less stressful places

There is quality sound, and there is noise. Sadly, in our day-to-day lives, we have way too much of the latter. Excessive noise can cause several short- and long-term health problems, such as sleep disturbance, cardiovascular effects, poorer work and school performance, and the most obvious risk: hearing impairment. Noise has emerged as a leading environmental nuisance in the World Health Organization's European region, and the number of public complaints about excessive noise is growing.

PearPop wants to boost your social following by connecting you to TikTok stars for collabs

In the social media ecosystem, there are influencers seeking new revenue streams and aspiring influencers looking to grow their followings. PearPop wants to be the bridge that connects the two. PearPop, which launched last October, is a platform where users pay TikTok influencers to collaborate on content. The influencers set their price for a duet, stitch, or sound (prices range anywhere from $15 to $3,333 per post), and users have the option to pay outright or bid a higher amount if there's strong demand. In turn, that access to top influencers could boost a growing account. It's an idea that's catching on with investors and creators. PearPop recently announced raising $16 million in a Series A led by Alexis Ohanian's Seven Seven Six, with angel investors including Gary Vaynerchuk, Sean "Diddy" Combs, Mark Cuban, Snoop Dogg, and YouTube star Jimmy Donaldson, aka MrBeast. PearPop currently has more than 10,000 creators on the platform (including such celebrities as Heidi Klum, Snoop Dogg, Shaquille O'Neal, and Kerry Washington) and has facilitated more than 1,000 transactions. (The company takes a 25% cut.) These early collabs have yielded some success stories. Model Leah Svoboda went from 20,000 to 141,000 followers after a PearPop duet with Anna Shumate (10.2 million followers). After musician Tobias Dray paid Katelyn Elizabeth (1.6 million followers) $25 to use one of his tracks as a sound on TikTok, uses of that song jumped from 30 to 671. "I always thought there should be a way to pay someone to collaborate with you directly," says Cole Mason, founder and CEO of PearPop. "It blew my mind that there wasn't a way to do that."

Making a market

It's easy to compare PearPop to the celebrity shout-out platform Cameo, but PearPop is establishing a distinct lane by creating a two-sided exchange with creators: High-level influencers earn revenue, and budding influencers gain social capital.

We don’t need weak laws governing AI in hiring—we need a ban

Sometimes, the cure is worse than the disease. When it comes to the dangers of artificial intelligence, badly crafted regulations that give a false sense of accountability can be worse than none at all. This is the dilemma facing New York City, which is poised to become the first city in the country to pass rules on the growing role of AI in employment. More and more, when you apply for a job, ask for a raise, or wait for your work schedule, AI is choosing your fate. Alarmingly, many job applicants never realize that they are being evaluated by a computer, and they have almost no recourse when the software is biased, makes a mistake, or fails to accommodate a disability. While New York City has taken the important step of trying to address the threat of AI bias, the problem is that the rules pending before the City Council are bad, really bad, and we should listen to the activists speaking out before it's too late. Some advocates are calling for amendments to this legislation, such as expanding definitions of discrimination beyond race and gender, increasing transparency, and covering the use of AI tools in hiring, not just their sale. But many more problems plague the current bill, which is why a ban on the technology is presently preferable to a bill that sounds better than it actually is. Industry advocates for the legislation are cloaking it in the rhetoric of equality, fairness, and nondiscrimination. But the real driving force is money. AI fairness firms and software vendors are poised to make millions from the software that could decide whether you get a job interview or your next promotion. Software firms assure us that they can audit their tools for racism, xenophobia, and inaccessibility. But there's a catch: None of us know if these audits actually work. Given the complexity and opacity of AI systems, it's impossible to know what requiring a "bias audit" would mean in practice.
As AI rapidly develops, it's not even clear if audits would work for some types of software. Even worse, the legislation pending in New York leaves the answers to these questions almost entirely in the hands of the software vendors themselves. The result is that the companies that make and evaluate AI software are inching closer to writing the rules of their industry. This means that those who get fired, demoted, or passed over for a job because of biased software could be completely out of luck.