In this latest DeepAR release, we have made significant strides in improving stability, ease of use, and integration.
Our ultimate goal is to make AR technology accessible to all, which is why we have worked tirelessly to enhance the user-friendliness of DeepAR. We have made great progress in simplifying the installation of the DeepAR library, and we have built popular, easy-to-use AR functionality directly into it — so businesses can focus on delivering a top-notch user experience while we take care of the complexities of AR technology.
📦 DeepAR for iOS is available on CocoaPods and Swift Packages.
📦 DeepAR for Android is available on the Maven repository.
👤 Easy-to-use background blur and background replacement API.
👟 Improved Shoe Try-On and Watch Try-On detection and tracking.
One of the most important features of any library is how easy it is to install. For app development, smart and easy dependency management is crucial. With the release of version 5.3.0, DeepAR is available on the most popular dependency managers.
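For reference, adding DeepAR through each manager looks roughly like this. The pod name, Maven artifact coordinates, and version specifiers below are assumptions for illustration — check the installation docs for the exact, current values.

```ruby
# Podfile — CocoaPods (pod name and version assumed; verify against the install docs)
platform :ios, '13.0'

target 'MyARApp' do
  use_frameworks!
  pod 'DeepAR', '~> 5.3'
end
```

```groovy
// build.gradle — Maven repository (artifact coordinates assumed; verify against the docs)
dependencies {
    implementation 'ai.deepar.ar:DeepAR:5.3.0'
}
```

Swift Package users can instead add the DeepAR package URL under Xcode's "Package Dependencies" tab.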
Can you imagine modern video calling without the option to blur or replace your background? For a long time now, DeepAR has been providing these features through loading DeepAR effect files. While this approach provides great flexibility and robustness in creating all sorts of AR effects and filters, it has one major snag when it comes to background effects: you had to create multiple effect files for different blur strengths or background images.
Although you could get around creating multiple effect files by leveraging our DeepAR change parameter API, it’s admittedly a bit tricky to grasp, especially if you're new to AR. But guess what? We've been listening, and we've been improving.
That's why we created a simple-to-use API that comes built into DeepAR:
Enable background blur on DeepAR Web with blur strength 5.
Enable background replacement on DeepAR Web with an image of a sunny beach.
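A hedged sketch of how these two calls might look with the DeepAR Web SDK. The method names (`backgroundBlur`, `backgroundReplacement`), their signatures, and the initialization options are inferred from the notes above, and the license key, canvas id, and image filename are placeholders — confirm everything against the Web API reference before using it:

```javascript
import * as deepar from 'deepar';

// Initialize DeepAR on a canvas element (placeholder license key and canvas id).
const deepAR = await deepar.initialize({
  licenseKey: 'your-license-key',
  canvas: document.getElementById('deepar-canvas'),
});

// Enable background blur with blur strength 5.
deepAR.backgroundBlur(true, 5);

// Replace the background with an image of a sunny beach (placeholder filename).
deepAR.backgroundReplacement(true, 'sunny-beach.jpg');
```

Passing `false` as the first argument would presumably turn the effect back off — no effect files or parameter-change calls required.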
The DeepAR ML team is constantly working on improving the detection and tracking of various body parts — to enhance AR try-on capabilities, give artists more room to experiment, and improve the user experience for our advertising and try-on partners.
This time, the focus was on improving the ML models for foot and wrist tracking. Once again, our ML team has amazed us with improved performance and accuracy. Shoes now snap to feet better than ever, and we are particularly proud of the improved wrist tracking in situations where the user is rotating their hand.
Try out our demos:
We have seen loads of exciting try-on experiences being built since we launched shoe try-on for web last year, with a lot of demand to bring it to our iOS and Android SDKs. We are excited to announce that foot tracking for iOS and Android is now in the beta testing phase! If you would like to join the beta, please get in touch explaining your use case for foot tracking and we will prepare an invite when we are ready to expand the programme.
The new release of DeepAR comes with practical enhancements and stability improvements that are worth mentioning. Here's what we've done:
While not a critical update, we believe that good documentation is key to a great developer experience. With that in mind, we're introducing our new DeepAR iOS API reference. Designed to mirror Apple's own documentation style, it provides a familiar and intuitive environment for developers to navigate and find what they need. Notably, we've documented the API in both Swift and Objective-C.
You can access the new documentation here.
We're particularly proud of this update as it seems we're ahead of the curve—few iOS frameworks are currently leveraging these new documentation tools from Apple.
We're currently focused on expanding the capabilities of our beauty API, with an eye on making its integration into your existing apps even simpler.
We understand how crucial accurate face tracking is, especially for applications like virtual glasses try-on. That's why we're putting our efforts into enhancing this feature to deliver the most realistic and smooth experience we can. Our team is in the trenches, developing new algorithms and technologies to make this happen.
We greatly value your feedback and suggestions and we're always open to hearing what you need most from our platform. So, stay tuned for more updates, and as always, thank you for your continued support and trust in DeepAR. Until next time, happy creating!