Don’t get too excited about Apple Music’s ‘spatial’ and ‘lossless’ music

I’ve often gushed about my admiration for Apple’s commitment to music. The company employs a lot of musicians or ex-musicians, and even more music lovers. That’s not trivial: It says something about the company’s culture and the way it approaches creativity and collaboration. Apple has obviously made many important music-related announcements in its time, but this week’s announcement that Apple Music will offer “lossless” and “spatial” audio probably won’t end up rocking the world.

Spatial audio

Apple has been working with Dolby to begin making some of the Apple Music catalog available in Dolby’s proprietary Atmos format. Those recordings are meant to sound something like the experience of watching a movie with surround-sound technology, where sounds might come from behind you, above you, or anywhere else within a spherical audio space around you. And sounds can move around in that space, so a guitar solo might seem to slowly circle above your head (which is cool, because guitar solos are boring).

Apple says it’s going to start off with a few thousand Atmos songs in June, including some from Ariana Grande, Kacey Musgraves, and others, and then add more tracks over time. When spatial support launches next month, Apple devices will be set to play available Atmos songs by default, rather than the regular stereo mixes. I’ve no doubt the Atmos mixes themselves will be true to the spatial concept. Read More …

Google’s new AI dermatologist can help you figure out what that mole is

On Tuesday, at Google’s annual developer conference I/O, the company announced the launch of a new search tool for skin, nail, and hair conditions to serve the two billion people around the world who suffer from them. The technology, validated in a paper published in Nature last year, is nearly as good as a dermatologist at identifying 26 skin conditions, and more accurate than the primary care physicians and nurses in the study. The new search tool, which will launch later this year, serves as another example of how the company thinks it can support doctors and patients through everyday products.

The dermatological assistant lives inside Google Search and requires at least a 3G connection. To use it, a person must provide consent and then upload three well-lit photos. The program will ask them a series of questions about their condition. You can bypass this section, but Google product manager and physician Dr. Lily Peng says answering these questions will make the results more accurate. Afterwards, the tool will serve up a list of possible matches, with the top three being the most likely culprits. If the AI is less confident in its suggestions, it will note that it is still learning about certain conditions. In addition to skin conditions, the tool will show articles and other related content.

Users can then either save their results, delete them, or donate them to Google’s internal research efforts. For those who choose to save or donate, the data is encrypted both in storage and in transit, and the company says it will not use the data to target ads. During the three years of research and development that went into the tool, Google trained its dermatological assistant on millions of de-identified skin images. To ensure its technology worked across skin types and tones, Google partnered with 17 clinics to bring in 65,000 de-identified photos of patients’ skin.
It can now identify 288 skin, hair, and nail conditions of the over 3,000 conditions that fall within the purview of a dermatologist, according to the American Academy of Dermatology Association. Google says its dermatological assistant is not a diagnostic tool, though both the U.S Read More …
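Google hasn’t published the internals of its model, but the behavior described above, ranking candidate conditions, surfacing the top three, and flagging weak confidence with a “still learning” note, can be sketched roughly as follows. All names, scores, and the confidence threshold here are hypothetical illustrations, not Google’s actual API or values:

```python
def triage(condition_scores, top_k=3, confidence_floor=0.15):
    """Rank candidate conditions the way the tool's UI is described.

    condition_scores: dict mapping condition name -> model probability
    (hypothetical; the real system's outputs are not public).
    Returns the top-k likely matches plus a flag mirroring the tool's
    'still learning' note when even the best score is weak."""
    ranked = sorted(condition_scores.items(), key=lambda kv: kv[1], reverse=True)
    top = ranked[:top_k]
    low_confidence = top[0][1] < confidence_floor if top else True
    return top, low_confidence

# Illustrative scores for a single uploaded case.
scores = {"eczema": 0.42, "psoriasis": 0.31,
          "contact dermatitis": 0.12, "rosacea": 0.05}
matches, uncertain = triage(scores)
# matches holds the three most likely culprits, best first;
# uncertain is False because the top score clears the floor.
```

The point of the `low_confidence` flag is the design choice the article describes: rather than always presenting a confident-looking answer, the tool signals when its suggestions should be treated as tentative.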

For Google Maps’ trickiest challenges, AI is the answer

Every time you ask Google Maps to provide driving directions, it considers many options and selects one as the optimum route. Naturally, getting you to your destination in an efficient manner is a primary goal. But when you set out on a trip, efficiency isn’t the single most important factor. Above all, you’d like to get there safely.

That’s the premise behind a new feature that Google unveiled today during this year’s online version of its I/O developer conference. Google Maps will now identify road segments where drivers tend to slam on their brakes, and it will try to route you around such areas even if they’re theoretically part of the most obvious route. Figuring out where the danger zones are so you can avoid them is “one of the most complex problems I’ve been lucky enough to tackle in my time at Google,” says director of product Russell Dicker, who’s worked on Maps off and on for seven years. The company solved it by applying AI to data, as it’s been doing with a bevy of other recent and upcoming tweaks to the world’s most popular mapping app.

It’s pretty obvious why hard braking might be a sign of a dangerous stretch of road: It’s evidence that drivers are reacting to something unexpected. And if everyone involved doesn’t react quickly enough, the result can be an accident. Indeed, Dicker says that the inspiration for the new Google Maps feature came from an incident a couple of years ago, when a Google Maps product manager rear-ended his father’s car at “this intersection with one of those super-short yellow lights.” Everyone was okay, but the mishap led the Googler to delve into the topic of hard-braking incidents, the subject of considerable research by organizations such as the Virginia Tech Transportation Institute.
“We think that we’re going to have the ability to potentially eliminate around 100 million hard-braking events.”
Russell Dicker, Google

The more the Maps team looked into the issues that can lead to hard braking, which range from road geometry to sunlight hitting drivers in the eyes, the more comfortable it felt factoring them into its driving directions. “We’ve seen that there can be a sudden increase in hard-braking events along a segment when it’s raining extra hard,” says Dicker. “And so this was the next ‘Aha’ moment for us, because we realized that understanding environmental factors and helping people navigate them successfully was what Google Maps has done for years.”

So how does one identify roadways that are prone to hard-braking incidents? Google had an obvious opportunity to collect relevant data: The Google Maps app runs on smartphones equipped with accelerometers, allowing it to detect motion, or the abrupt lack thereof. But phones aren’t bolted to vehicles; they’re subject to independent movement of their own within the cabin. That meant that raw accelerometer data was of limited value. Google discovered a workaround in the fact that a decent chunk of Google Maps navigation involves Android Auto, the feature built into many recent vehicles that lets you project apps from your phone onto a dashboard touchscreen. A phone that’s powering an Android Auto session is at least tethered to the vehicle it’s in, and Google found that it provided more robust evidence of hard braking Read More …
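Google hasn’t disclosed its actual pipeline, but the approach the article describes, flagging sharp decelerations from tethered accelerometer readings, counting them per road segment, and then penalizing those segments during routing, can be sketched like this. The threshold, penalty weight, and data shapes are all assumptions for illustration (a deceleration around 3.4 m/s² is a common “hard brake” cutoff in telematics research, not a figure Google has published):

```python
from collections import defaultdict

# Hypothetical deceleration cutoff (m/s^2) for flagging a hard brake.
HARD_BRAKE_THRESHOLD = 3.4

def hard_brake_events(samples, threshold=HARD_BRAKE_THRESHOLD):
    """Count hard-braking events per road segment.

    samples: list of (segment_id, longitudinal_accel_mps2) readings,
    imagined as coming from phones tethered via Android Auto, where
    accelerometer data is more trustworthy."""
    counts = defaultdict(int)
    for segment_id, accel in samples:
        if accel <= -threshold:  # braking shows up as negative acceleration
            counts[segment_id] += 1
    return dict(counts)

def route_cost(route, counts, brake_penalty=5.0):
    """Toy routing cost: base travel time per segment plus a penalty for
    each observed hard-brake event, so risky segments lose close calls.

    route: list of (segment_id, travel_time_seconds)."""
    return sum(t + brake_penalty * counts.get(seg, 0) for seg, t in route)

samples = [("A", -4.1), ("A", -1.0), ("B", -3.9), ("A", -5.2)]
counts = hard_brake_events(samples)
# Segments A and B both recorded hard brakes; gentle braking is ignored.
```

With costs inflated this way, a slightly longer route through calm segments can beat the “theoretically most obvious” route through a braking hot spot, which is exactly the trade-off the feature makes.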

How Google’s new ‘MUM’ algorithm could transform the way we search

Google is flexing its artificial intelligence muscle to help users of its search engine research complex tasks that would normally involve multiple queries. Many of the Google searches we do are just a single query, such as “file a request for extension federal tax.” But other searches involve several queries about different aspects of a complex task. You might, for example, want to know how to prepare for a river rafting trip in Montana in August, and how those preparations might differ from the ones you made before your Colorado River rafting trip last fall. If you asked a local rafting expert how to prepare, you might get an extended answer that covers a range of relevant questions. Will the weather be hotter than it was in Colorado Read More …

Amid worker and regulator complaints, Google is facing a turning point

By any measure, Google is a colossus of the tech industry, with a market capitalization of nearly $1.5 trillion, a massive army of lobbyists, and elite academics at its disposal. But lately, its reputation has been hurt by a highly publicized feud with well-respected ethical AI researchers, and revelations about its toxic workplace, previously hidden under NDAs, are roiling the tech giant’s PR-spun, Disneyland-like facade. Now it’s facing a multitude of challenges, including talent attrition, resistance from an increasingly influential union, and increased public scrutiny. Privacy-centered competitors are nipping at its ankles, antitrust regulations loom on the horizon, and user interest in de-Googling their online activities is mounting. These headwinds are threatening the tech giant’s seemingly unassailable industry dominance and may bring us closer to a “de-Googled” world, where Google is no longer the default.

At war with its workers

In December 2020, the tech giant dismissed eminent scholar Timnit Gebru over a research paper that analyzed the bias inherent in large AI models that analyze human language, a type of AI that undergirds Google Search. Google’s whiplash-inducing reversal on ethics and diversity as soon as its core business was threatened was not entirely surprising. However, its decision to cover this up with a bizarre story claiming that Gebru resigned sparked widespread incredulity. Since Gebru’s ouster, Google has fired her colleague Margaret Mitchell and restructured its “responsible AI” division under the leadership of another Black woman, now known to have deep links to surveillance technologies. These events sent shock waves through a research community beholden to Google for funding and triggered much-needed introspection about the insidious influence of Big Tech in this space.
Last week, the organizers of the Black in AI, Queer in AI, and Widening NLP groups announced their decision to end their sponsorship relationship with Google in response. While the prestige and lucrative compensation that come with working at Google are still a huge draw for many who don’t consider these issues a dealbreaker, some, such as Black in AI cofounder and scholar Rediet Abebe, were always wary. As Abebe explained in a tweet, her decision to back out of an internship at the tech giant was triggered by Google’s mistreatment of BIPOC, its involvement with military warfare technologies, and its ouster of Meredith Whittaker, another well-known AI researcher, who played a lead role in the Google Walkout in 2018.

Abebe is not the only one who has decided to walk away from Google. In response to this latest AI ethics debacle, leading researcher Luke Stark turned down a significant monetary award, other talented engineers resigned, and Gebru’s much-respected manager Samy Bengio also left the company. A few years back, this level of pushback would have been unimaginable given Google’s formidable clout, but the tech giant seems to have met its match in Gebru and other workers who refuse to back down. Even with its formidable PR machinery spinning out an announcement touting an expanded AI ethics team, the damage has been done, and Google’s misguided actions will hurt its ability to attract credible talent for the foreseeable future. More ex-employees are also coming out with details of their horrifying experiences, adding fuel to the rising calls for better employee protections. These disclosures have renewed support for tech workers as hundreds of Google employees unionized after many years of activism, despite union-busting efforts by their employer. Read More …