Snapchat unveils AR features that let users decorate landmarks and use their voice to change effects

Snapchat unveils augmented reality features that let users decorate landmarks in their neighborhood and use voice commands to change effects in camera mode

  • Snapchat announced voice activation for its AR lenses
  • Users can ask the app to change the filters using voice commands
  • It also announced collaborative AR landscapes called ‘Local Lenses’
  • The feature allows users to modify entire neighborhoods in AR 

Snapchat will offer users and developers a host of new ways to use and create augmented reality features.

At the company’s second annual Partner Summit, which was held digitally due to restrictions caused by COVID-19, Snapchat announced several new augmented reality features, including voice activation for its AR filters.

With a new feature called ‘Voice Scan’, users will be able to tap and hold on the camera screen to use voice commands that change AR filters.


A new feature called ‘Voice Scan’ will allow users to activate different AR effects in the camera mode using their voice. Pictured: a demonstration in which a user asks Snapchat to make their hair pink

For instance, Snapchat demonstrated how a user could activate Voice Scan and then say, ‘Hey Snapchat, make my hair pink.’

To power the feature, Snapchat partnered with voice recognition company SoundHound, which has previously worked with automakers such as Hyundai, Mercedes-Benz and Honda to bring voice integration to their vehicles.

In addition to Voice Scan, Snapchat also announced partnerships with PlantSnap and DogScanner, which allow users to identify plants and trees, and dog breeds, respectively.

A partnership with food and nutrition technology company Yuka will also allow users to scan food packages to pull up nutrition information in a feature aptly dubbed NutritionScanner.

Outside of new features for ‘Scan’, Snapchat also gave a preview of ‘Local Lenses’, which allows users to create their own shared AR worlds.

Local Lenses (pictured) lets users paint entire neighborhoods and builds upon a previous feature that transformed landmarks in AR

Local Lenses will enable what Snapchat is calling a ‘persistent, shared AR world’ in which users are able to use the app to decorate and modify shared spaces in a community by, for example, virtually painting a building.

The new feature builds on Landmarkers, which superimposes AR effects onto famous landmarks around the world.

Instead of just being limited to landmarks, however, Local Lenses will apply to entire neighborhoods and allow users to paint and modify their environments collaboratively. 

The feature pulls on data sources such as 3D imagery and community Snaps to digitally map and reconstruct environments so that users can interact with entire city blocks.

Snapchat also announced more developer-focused features around its augmented reality pursuits, including an expansion of Lens Studio, which enables people to build their own AR filters.

Developers will now be able to take advantage of SnapML, Snap’s machine learning framework, which lets them use their own neural networks to power Lenses.

It’s unclear exactly what the capability will yield, but the ability for developers to use their own AI to develop lenses inside Snapchat could greatly expand the sophistication of augmented reality effects created by Snapchat’s own community.

To start, Snapchat will offer foot-tracking effects that allow users to try on shoes in augmented reality.