Snapchat shared a slew of announcements today at its Snap Partner Summit event. The headliners include third-party apps integrated into Snapchat, called “Minis,” coming in July; a new “Happening Now” banner and a redesigned bottom “Action Bar”; voice commands for Lenses; and business listings in the app’s map feature.
Parent company Snap announced the news during the event and in a series of blog posts.
For user-facing changes, Snapchat has launched the “Happening Now” banner in the US to easily keep up with breaking news. The feature will roll out in more countries over the next year.
Another update is a refreshed bottom bar, or “Action Bar” as Snapchat calls it, that makes it easier to access all of the app’s features.
Here’s how the new five-tab design looks:
Meanwhile, businesses will be viewable in the Snap Map with profiles and options for users to place orders right within the app.
Snapchat news for devs
Today’s news includes a good amount of updates for developers. Here’s what’s new:
Camera Kit invites developers to bring Snapchat’s AR camera to their own apps. Soon on Squad, you’ll be able to add fun Snapchat lenses while you browse, chat, shop, and watch videos together. With Triller’s Camera Kit integration, you’ll be able to create your own music videos with special Snapchat lenses.
An even bigger development: third parties can now bring their apps inside Snapchat with a new feature called “Minis.”
Snap Minis are a new way for developers to bring their services inside Snapchat and empower new, social experiences. We’ve carefully designed Minis to deeply integrate within your conversations, so coordinating with friends is faster than ever.
Other improvements include Bitmoji for Games and Dynamic Lenses. Interested developers can learn more about Snap Kit and the new additions here.
https://www.youtube.com/watch?v=l7cd65DdP2w&feature=youtu.be
More news includes SnapML, now available for any Lens creator to use; Local Lenses; and new Scan partners, including “Voice Scan” voice commands powered by SoundHound:
Today, we’re introducing SnapML, which lets any developer bring their own machine learning models directly into Lens Studio to power Lenses. Now, anyone can create their own Lenses with neural networks, expanding the possibilities for Lenses that can transform the world. We partnered with Wannaby, Prisma, CV2020, and Official Lens Creators on the first SnapML creations.
Here’s how Local Lenses will work:
Local Lenses enable a persistent, shared AR world built right on top of your neighborhood. You and your friends can step into augmented reality together to decorate nearby buildings with colorful paint and experience a new dimension of AR.
…
Additionally, we’re introducing Voice Scan, which offers Lens results based on your voice commands. Powered by our partnership with SoundHound, press and hold to tell Snapchat what kind of Lens you want to see.