People + AI Research (PAIR) Collection
Neural networks. Teachable machines. The UX of AI. No longer relegated to sci-fi movies, these terms and the concepts behind them are increasingly part of our day-to-day experiences. For UX designers and the people building this tech, it’s mission-critical to understand the field’s challenges, the effect of bias, and how to use a human-centered design process. Enter People + AI Research (PAIR)—the Google team focused on surfacing articles, resources, and frameworks that do just that. We’re especially amped about their new collection on Google Design. As a resource, it’s one of the few places you can find practical insights on designing with ML, along with case studies that unpack the thinking and design decisions behind real products—like Google Clips and Emoji Scavenger Hunt (🕵️‍♀️). With a throughline on building more inclusive tech, this collection is particularly resonant in 2018. We’ve bookmarked it, and so should you.
Who’s that doggie in the window? Whether you need the deets on your neighbor’s pup (turns out, it’s a Shiba Inu) or help remembering what bouillabaisse actually looks like, Google Lens lets you search IRL. Simply point your camera, tap, and watch as Lens drops some knowledge you can use—the Shiba Inu is a small dog that copes very well with mountainous terrain—on the spot. Lens launched in Google Photos and the Google Assistant last year, and the big news for 2018 is that the technology is now available directly in the Pixel camera app (and on other supported devices), so you’ve got state-of-the-art ML in real time, wherever you may be.