around the country — and around the world — can virtually step into the Apollo 11 and experience it for themselves.

The hardware involved in AR

Before we jump into development tools and platforms, it's important to briefly address the hardware on which AR experiences are being deployed. Smartphones and VR headsets are the primary devices currently running AR. While mobile is the more popular and consumer-friendly option, VR headsets offer a truly immersive AR experience. Some of the leading headsets include Microsoft's HoloLens, Facebook's Oculus, and the HTC Vive.

To bring AR experiences to these devices, Google and Apple have spent the past few years actively developing AR tools and frameworks for their respective mobile platforms, Android and iOS. But before we jump into what Apple and Google are up to, it's worth taking a look at some of the most important publishing/content platforms for AR. Snapchat, Instagram, and Facebook are the largest mobile platforms for AR camera filters. Let's take a look at how you can develop and publish on each of them.

Snapchat
Snapchat has been leading the way in AR innovation, providing users with everything from multiplayer AR games to pet filters. Snapchat started allowing third-party creators to publish lenses (its term for filters) in 2017, when it launched its lens creation tool, Lens Studio. Since then, more than 400,000 lenses have been created, and they have been used more than 15 billion times. The latest version (v2.0) of Lens Studio allows anyone to create and publish lenses with ease. The platform is very user-friendly, and you can even put together a fully functional filter within seconds, without writing a line of code.
Lens Studio

The Lens Studio documentation is awesome and super easy to understand. Almost all filters are created from sample templates provided by the Snapchat team, which can be customized simply by dragging and dropping objects and images on the screen and toggling their properties. Advanced functionality and richer interactions can be achieved with scripts written in JavaScript, as in the sketch below. Lenses can be tested before publishing in the Snapchat app itself via Lens Studio's "push to device" option. Filters made by third-party developers can be unlocked for 48 hours by scanning the lens's Snapcode (a QR code) or by using a direct link.
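To give a feel for Lens Studio scripting, here is a minimal sketch of a JavaScript script attached to a Script Component. The "hat" input name and the tap-to-toggle behavior are illustrative assumptions, not part of any template; any SceneObject from your scene can be bound to the input in the Inspector panel.

    // Minimal illustrative Lens Studio script (JavaScript).
    // The "hat" input is a hypothetical example; bind any SceneObject
    // from your scene to it in the Inspector panel.
    //@input SceneObject hat

    // Toggle the object's visibility each time the user taps the screen.
    var tapEvent = script.createEvent("TapEvent");
    tapEvent.bind(function (eventData) {
        script.hat.enabled = !script.hat.enabled;
        print("hat enabled: " + script.hat.enabled);
    });

More involved lenses combine several such events (touch, face, world tracking) with the objects already set up in a template, so scripting usually only handles the interaction logic.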
Instagram & Facebook The Facebook family of apps lags behind Snapchat in innovation; however, they are far ahead in approachability. Instagram & Facebook together feature a user base of over 3 billion people. The tool used for creating and publishing on these platforms is Spark AR Studio . At the F8 developer conference this year, Facebook announced that more than 1 billion unique people have used filters created using Spark AR studio in the last year alone.
