Watch the ‘Perseverance’ rover land on Mars in this just-released video

Since we began sending probes to the surface of Mars, our experience of their landings was a nail-biting silence, punctuated only by a NASA Mission Control engineer announcing milestones in the spacecraft's progress. That all changed with the Mars 2020 Perseverance rover filming its February 18 landing. Six of the 23 onboard commercial cameras shot high-definition footage of the supersonic descent—dubbed the "7 minutes of terror"—and the rover's first surface movements. Three cameras trained on the parachute, while another three videoed the descent stage, rover, and approaching ground. Jet Propulsion Laboratory (JPL), the Pasadena, California, facility that built the rover and manages the $2.7 billion mission, premiered the high-resolution video during today's briefing. This marks the first time we're able to watch a spacecraft land on another planet. "These images and videos are the stuff of our dreams," said Mars 2020 entry, descent, and landing (EDL) lead engineer Allen Chen.

"I just couldn't believe my eyes; the images were better than I could have imagined," JPL's Adam Nelessen told Fast Company about his initial reaction to the footage. An EDL lead systems engineer, Nelessen focused on the EDL camera technology. "There is a lot that we can learn from the imagery. One of the best engineering outcomes is going to be recording the inflation of the parachute at a high frame rate. We're going to learn just how well this thin piece of fabric is actually performing."

This is also the first time EDL engineers have seen the landing process unfold in its entirety, as they were only able to test it in separate stages on Earth. The footage revealed that the EDL navigation system came to within 16 feet of its landing target. The video also gave a better sense of the debris that kicks up during landing, a growing concern as NASA looks to land ever-heavier payloads on Mars. "We worry about dust and sand confounding radar sensors and making our landing more difficult," Nelessen adds.
"So seeing what the dust environment and hazards are like in the area has really good engineering uses for us." Plus, observing the landing site on approach offers a head start on how best to navigate the area to achieve the science objectives. More raw images of Mars can be found here.

High-resolution photo from the descent stage camera of Perseverance being lowered to the Martian surface via the sky-crane mechanism. Read More …

I’m an ethical hacker. Here’s how I could use social media to scam you

Scam emails aren’t what they used to be. Gone are the days of fraudulent emails filled with typos and Nigerian princes promising riches if only we hand over our credit cards. Today’s phishing emails can be quite convincing, often addressed to us by name or with specific personal details. Modern hackers can find everything they need to know about a potential target through Google or social media and use this information to architect the perfect scam. Read More …

How to prevent the next GameStop disaster

The mind-numbing inanity of last week's GameStop hearing on Capitol Hill was just as predictable as the worthless result. Of course members of both parties wanted in on the media frenzy surrounding Robinhood and WallStreetBets, the Reddit forum where thousands of amateur investors mounted a historic campaign to pump (and dump) the stock of a left-for-dead video game retailer. Talking heads on CNBC were alarmed, and so the House Financial Services Committee ordered hearings, subpoenaed witnesses, and played for the cameras at every turn. By the end of last Thursday's spectacle, the consensus was clear: We learned absolutely nothing.

Not surprisingly, Congress focused on the wrong culprit. Yes, Robinhood's marketing as "the platform for the average investor" ended up conflicting with its treatment of the average investor once it had to stop taking GameStop trades, making the company look like a greedy hypocrite. (Fast Company has a brief explainer here.) And yes, the use of Reddit and Twitter to drive market forces and propel certain stocks is new and a little scary. But Robinhood, Reddit, and Twitter were all using their platforms in exactly the ways they were intended: to spread and drive information and access.

If there's a villain in the GameStop saga, it's the federal regulators—in this case, the Securities and Exchange Commission (SEC)—who failed to notice that the world was changing and didn't bother to update the rules accordingly. By definition, regulation will always lag behind innovation. Regulators can't know what rules are needed until an entrepreneur first thinks of the new idea, turns it into actual technology, turns that technology into a business, and then starts selling its product or service. But once that happens, it's not necessary to wait for a debacle before updating the rules. In the case of GameStop, the two-day settlement requirement meant that Robinhood couldn't keep taking trades without raising more capital.
That two-day waiting period made sense in a previous era—one before blockchain and the cloud. But it still exists because of inertia and complexity—and, historically, because it produced extra revenue for brokerages—not because it's technologically necessary. Real-time settlement is not only feasible; it would have prevented all of the harms caused to Robinhood's investors. The SEC knows that, but it didn't act on it. That was a mistake. GameStop is but one example. Take something more significant, like self-driving cars and trucks. Read More …

A patent shows how facial recognition drones could identify you from above

An Israeli biometrics startup called AnyVision with ties to Israel's military has applied for a U.S. patent on technology that tells drones how to maneuver to capture better facial recognition images of people on the ground. Facial recognition technology has become widely used by law enforcement around the world, but it is controversial in part for its accuracy issues, especially when recognizing Black and brown faces. Activists are now calling for ending its use entirely, and police use of facial recognition has already been banned in a host of U.S. cities.

The patent application, titled "Adaptive Positioning of Drones for Enhanced Face Recognition," describes a computer vision system that analyzes the angle of a drone camera in relation to the face of a person on the ground, then instructs the drone on how to improve its vantage point. The system can then send that image through a machine-learning model trained to classify individual faces. The model sends back a classification with a probability score. If the probability score falls below a certain threshold, the whole process starts over again.

A future defined by this type of mass surveillance would "obliterate privacy and anonymity in public as we know it," said Kade Crockford, head of the Technology for Liberty Program at the ACLU of Massachusetts, who has led the charge on banning facial recognition in Massachusetts cities, in an interview with Fast Company last year. "Weirdly this is not a hugely controversial issue for voters. People don't want the government to be tracking them by their face every time they leave their house."

As with any patent application, there's no guarantee the technology will show up in a real product. But it does address a very real technical problem with existing facial recognition systems.
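The capture-classify-reposition loop the application describes can be sketched in a few lines. This is a toy simulation, not AnyVision's system: the classifier, the threshold value, and the way each maneuver improves the camera angle are all hypothetical stand-ins for illustration.

```python
CONFIDENCE_THRESHOLD = 0.85  # assumed cutoff; the application doesn't publish a value

def classify_face(angle_error):
    """Stand-in for the trained model: returns (label, probability).
    Here the score simply degrades as the camera angle gets worse."""
    score = max(0.0, 1.0 - angle_error / 90.0)
    return "person_42", score

def adjust_drone_position(angle_error):
    """Stand-in for the maneuvering step: the patent's system computes a
    better vantage point and commands the drone to move toward it."""
    return max(0.0, angle_error - 20.0)  # pretend each move improves the angle

def recognize_from_drone(angle_error, max_attempts=5):
    """The loop the application describes: capture, classify, and if the
    probability score falls below the threshold, reposition and retry."""
    for attempt in range(1, max_attempts + 1):
        label, score = classify_face(angle_error)
        if score >= CONFIDENCE_THRESHOLD:
            return label, score, attempt
        angle_error = adjust_drone_position(angle_error)
    return None, 0.0, max_attempts
```

Starting far off-angle, the simulated drone needs several repositioning passes before the score clears the threshold; starting nearly head-on, it succeeds on the first frame. That retry-until-confident structure is what distinguishes the patent from systems built on stationary cameras.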
Such systems usually process images captured by stationary cameras. Capturing a clear angle on someone's face, and compensating for bad ones, is a perennial challenge for these systems. Shooting video from drones that can move around and intelligently zero in on the right angle takes the chance out of the process. The application, first reported by Forbes cybersecurity writer Thomas Brewster, was filed last summer and published by the U.S. Patent Office on February 4. AnyVision, which was founded in 2015, sells artificial intelligence designed to let cameras in retail stores recognize the faces of people on "watch lists" who have been convicted of theft in the past. Read More …

Stitch Fix’s former data chief wants to personalize your food at Daily Harvest

Brad Klingenberg, who oversaw Stitch Fix's formidable personalization and curation efforts as the company's Chief Algorithms Officer, is taking on a new challenge: tailoring salads, flatbreads, smoothies, and desserts exactly to your taste at Daily Harvest. The six-year-old company makes easy-to-blend smoothies, veggie-filled harvest bowls, soups, lattes, oat bowls, and healthy desserts that can be prepared by simply warming them up or popping them in a blender. Customers order Daily Harvest's offerings by signing up for a weekly or monthly plan and receiving deliveries already portioned and ready to eat. As the company's first Chief Data Officer, Klingenberg hopes to use the data the company collects to make its meals even more delicious.

At San Francisco-based Stitch Fix, which Klingenberg joined more than seven years ago as the third person on the now 145-strong data team, algorithms learn people's preferences over time through feedback on the clothes customers receive and purchase and through data gathered on the website. Eventually, the algorithms learn to generate tailored recommendations and even advise vendors on possible clothing alterations. If many customers think a sweater is too short, for example, the company might relay that information back to the manufacturer to change the design.

"There's almost no corner of the business that's not touched by data science in some way," Klingenberg says. That includes recommending clothes for users' curated clothing boxes, as well as optimizing buying and inventory, and even developing new clothes. Read More …
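Neither company has published its models, but the feedback loop described above can be sketched in miniature: learn a customer's affinity for item attributes from keep-or-return feedback, then rank the catalog by those learned affinities. The attribute names, scoring rule, and catalog here are entirely hypothetical.

```python
from collections import defaultdict

def update_preferences(prefs, item_attributes, kept):
    """Reward the attributes of items the customer kept; penalize
    the attributes of items they sent back."""
    delta = 1.0 if kept else -1.0
    for attr in item_attributes:
        prefs[attr] += delta
    return prefs

def score_item(prefs, item_attributes):
    # An item's score is the sum of the customer's learned attribute affinities
    return sum(prefs[attr] for attr in item_attributes)

def recommend(prefs, catalog, k=2):
    """Rank catalog items (name -> attribute list) by affinity, best first."""
    return sorted(catalog, key=lambda item: score_item(prefs, catalog[item]),
                  reverse=True)[:k]
```

A customer who keeps a mango smoothie and returns a kale bowl would, after just those two signals, see fruit-forward items float to the top of their next box and bitter greens sink. Real systems layer in population-level data, cold-start priors, and inventory constraints, but the core loop is this simple.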