Streaming is available in most browsers,
and in the WWDC app.
- Reality Converter and Reality Composer make preparing augmented reality assets for your iOS or iPadOS app easier than ever. Discover how you can convert existing 3D assets into USDZ, bring them into Reality Composer to create AR experiences, and integrate with an existing Xcode project or export to AR Quick Look. We'll detail how to work with assets in a variety of popular 3D formats, build and test your AR scenes, and integrate with your artist's workflow. To get the most out of this session, you should be familiar with USDZ and Reality Composer. Watch “Working with USD” and “Building AR Experiences with Reality Composer” for more. And to get more details about the latest additions to USD for AR, check out “What's new in USD”.
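The session abstract mentions integrating converted assets with an existing Xcode project; for reference, loading a USDZ produced by Reality Converter into a RealityKit scene takes only a few lines. Here is a minimal sketch, assuming the converted file (called cloud.usdz here, an illustrative name not from the session) has been added to the app bundle and that a horizontal-plane anchor is appropriate:

```swift
import UIKit
import RealityKit

final class USDZViewController: UIViewController {
    private let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // Load the USDZ produced by Reality Converter (bundled with the app).
        // Entity.loadModel(named:) throws if the file is missing or malformed.
        if let cloud = try? Entity.loadModel(named: "cloud") {
            // Anchor the model on the first horizontal plane ARKit finds.
            let anchor = AnchorEntity(plane: .horizontal)
            anchor.addChild(cloud)
            arView.scene.addAnchor(anchor)
        }
    }
}
```

The same pattern works for any USDZ exported in the workflow described below; only the resource name changes.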
Hello, and welcome to WWDC.

Hello, everyone. My name is Philip Simmons, a designer here at Apple. Thank you so much for taking the time to tune in for my talk on the artist's AR toolkit. So today my talk is gonna focus on the artist's journey of creating compliant USDZs for use in Reality Composer. This will cover what file formats you should export from your DCC, how to bring those assets into Reality Converter, and make them into USDZs. Later, we'll look at using our assets inside Reality Composer and setting up some complex behaviors to create rich experiences. Lastly, we'll send all this data to device and view our newly created AR project.

So what does all this look like? From a high level, this is the journey we're gonna take. It is a straight line from your DCC of choice into Reality Converter, then Reality Composer, and, lastly, onto device for viewing and sharing.

So Reality Converter is a major step in this process, and we need to know a few things before we can begin pushing data through it. The primary things to note are gonna be the geometry types that can be imported and the texture types that are supported. For geometry, we have FBX, OBJ, USD and glTF. For textures, we have PNGs and JPEGs. These inputs combined allow Reality Converter to output a valid USDZ.

Since Reality Composer only accepts USDZ files, Apple has provided the tools necessary for developers to generate and inspect USDZs. Last year, Apple offered a suite of Python-based tools so developers could get started making their own content. This year, we're proud to be including Reality Converter as the next generation in that tooling. Both Reality Converter and the USDZ tools are available today through the developer.apple.com website.

All right. So let's get started generating some content. We're going to look at a few examples in different DCCs and send all that data over to Reality Converter.

So here, we're gonna begin by taking a look at an asset inside Houdini and exporting it out as a USD for use in Reality Converter. Now Houdini is known for its powerful simulation toolset, and in this demonstration, I will be scratching the absolute surface of that toolset to show you how we can take one asset, instance it around, and export those instances out as individuals in a hierarchy. Now this is an important distinction, because later I'm gonna show you how to reach into that USDZ hierarchy and apply behaviors directly to individuals instead of the asset as a whole.

So let's take a look at what I've set up here. Here I have the cloud geometry. You can see it in the viewport as we tumble around. Now when we jump into the geometry node, you can see what I've actually created. Here I'm bringing in the geometry on the left, and down the right-hand side I've built a graph. Now this graph is essentially building a volume, scattering particles throughout it, and then instancing our cloud at each one of those points. So if we look further down the chain at our combined result, I have clouds scattered through this volume. Now this is a live graph, so I can jump back in and update the scatter node to have more or fewer points. Here I'll drop the count from ten to eight. So, all together, we can go ahead and start exporting. For export, I'll jump over to the staging area. Here I've set up a USD ROP node. This is being fed our instance geometry and our particles. Now I'm combining these in such a way that keeps the instances in a hierarchy, as opposed to merging it as a solid piece of geometry.
So with my ROP node set, I can go ahead and save this to disk.

So with our newly exported asset, now is the perfect time to fire up Reality Converter and get to know the app. We'll go ahead and begin with our cloud asset and get a feel for how Converter's gonna work, and then we'll loop back and gather the other asset types for conversion.

So here we are with Converter loaded up. Converter starts with a blank scene, and it's asking you to drop in files. So let's do that. We'll navigate on disk and find our cloud USD that we've created. Now we can simply drag and drop this into the viewport and Converter will load up our asset. And here we are. With this asset loaded up, now is a great time to take a tour through the software and get a feel for everything it has to offer.

So first and foremost, we're in the 3D viewport. You can rotate, you can pan, you can zoom. Now down the left-hand side, we're gonna have the Models tab. The Models tab will fill up as you drag in more geometries. So once we fill this in with our other geometries later on, you'll see how we can toggle between them. Now this is a 3D viewport, and sometimes you can lose where you're at. So if you're playing around and you happen to zoom away from where you want to be, we've given you a framing icon to bring it back.

Now down the right-hand side is where you're gonna do most of your work. The Environments tab is where we're giving you preloaded IBLs. That's image-based lighting. These are the lighting environments that make your image look the way it does. You can simply toggle between them and find ones that suit your needs. Now if they're too light or too dark, we're giving you exposure controls down the bottom where you can make adjustments accordingly. In addition, we're giving you grid and environment views. Grid is perfect for knowing how large your asset is in relation to a certain scale. So, for example, our grid units here are set to one meter. You can see that each grid square is going to be a meter across. You can also see the environment. This is representing what the IBL is. So the IBL looks like this, and in the 3D viewport, it's gonna look like this. If you find that distracting, you can simply turn it off and get back to a solid color.

The Materials tab is where you're gonna hook up all your materials and textures. Here you can see we have two materials noted, but only one of them did I actually assign in my DCC. Converter is giving you a default material as a fallback nicety, so if you download something or you build something and forget to assign a material, we've given you a fallback material that will automatically be assigned on import. Now I've set up a material, so I know I'm ready to go. We can simply add our textures by dragging them in from disk as well. So if we navigate the disk, we can find our base color and drop it on our base color input. There we have it. We can go ahead and fill in the rest of them. We have a normal map, and we also have an ambient occlusion. And that's it. Our material is set up, and our textures are added.

So let's go ahead and look at our last tab. Our Properties tab is important. This is where you're gonna add your copyright information and also set up your base units. So for copyright information, you want to let the world know who made this asset. So we'll go ahead and tell them Apple made it. Now base units is an important feature. I know that because this asset came from Houdini, it was built in meters. So I want Converter to also export it in meters.
Later on, we'll see assets that were constructed in centimeters, and we'll go ahead and change it here to be centimeters for export. So with that all set, we can jump back to our DCCs and grab the rest of our assets.

So jumping back to our DCCs, next we are going to look at a static asset. Static assets are the most common type of asset you will encounter, and OBJ is a very popular file format to save static assets to. Nearly every DCC is going to export to OBJ, and that's why you will often find this as the available file type when downloading from online sources. So here inside Blender we can take a look at a static asset. This school asset is comprised of geometry, materials and textures. Here in the viewport, we can see the geometry. Down below, we can see I have a material hooked up with my texture inputs. And lastly, here are the UVs that the textures are being powered by. So all that combined gives our final look in the viewport. Happy with our asset, we can go ahead and export this as an OBJ and save it to disk.

glTF is our final geometry format that we accept. More and more DCCs are starting to pick it up as an export option, and KeyShot is one such app. So for our offline render community, you can now easily move your data to a real-time format through glTF and USDZ. Here in KeyShot, you can see the clock tower. Now, like the OBJ example, this is a static asset. So it'll have the same geometry, materials and texture setup as before. Here we can select the geometry in the hierarchy and see further down that I've assigned the material. On this material, I have base color and roughness assigned. Now if we inspect the material itself, you can see where I've plugged in those textures, and all of this combined is what's giving us our final render. So, happy with our asset again, we can go ahead and export this out as a glTF and save it to disk.

So the last topic before we go back to Reality Converter is downloaded content. As you can see, using DCCs is a complex and specialized skill set. So for developers that don't want to learn a DCC, or don't want to make their own art, online resources are available. Some have even started offering USDZs as a downloadable file type, which is perfect for use in Reality Composer, Sketchfab being one such site. But for those times that you find an asset online and it's not offered in USDZ format, you now have the capability to make it one yourself.

So with our new content, let's get back into Reality Converter and make some USDZs. Converter's still up, loaded and waiting for us. We can go ahead and navigate on disk and find our assets. Now, the first time we did this, we dragged in the geometry and the textures separately. That was the long way. I'm gonna show you a faster way to get through this software and get it done. So if you select at the folder level, you can see I've exported out our geometry and our associated textures. Now if we just grab things at the folder level, we can drag and drop that directly into Converter, and it'll hook it all up for us. So let's do that.

So here we can see we have our taxi, our clock tower and our schoolhouse. You can see down in the Models tab here that we've given an affordance for animated assets. So if you bring in an asset that has animation, it's gonna get this little running man icon. That's how you'll know at a glance which one has animation. So to that end, in the viewport, we're gonna give you the ability to play, pause and stop the animation.
So if you hit play, it'll simply loop endlessly. Now, same as before, you can toggle through your environment lightings if you want to view it in different lighting setups. You can show the environment and grid. That also works, same as before.

So over to the important part: we'll look at our properties. Properties, like we spoke about before, is where you add copyright information and you set your base units. Now this is important because we exported in centimeters from our last DCC packages. Here in Converter, we need to have that reflected. So we'll go ahead and change it to centimeters, and we'll do the same for the clock tower and the schoolhouse.

Perfect. So there we go. All our assets are in Converter, materials are hooked up, textures are applied, all our units are set and our copyrights are set. We can go ahead and export. Now you could export one at a time and save it out to disk, but there's a faster way. We'll give you the affordance to simply go File, Export All, and you can do it all in one big push. So we'll find a nice place on disk and hit Export. Now once the export is complete, you'll get a little check mark next to each one of the icons, and you'll know that that process has been done. We can verify that by jumping to Finder, and we can now see our assets here on disk.

So with all our content in USDZ form, we're now good to go with Reality Composer. We can simply spin up a project and drag in our assets to be used. So let's start. Jumping over to Reality Composer, we can bring up Finder and drag in those assets that we just created. So here's where you would know immediately if your unit scales were not set correctly in Converter. Everything is sitting nicely with each other, and it's the right scale and proportion to each other. If it wasn't exported at the right scale, some would be larger by 100x or smaller by 100x. Here we can visually see that everything is working fine.

Now our schoolhouse and our clock tower are static assets. So all we have to verify for them is that they look nice. So we can go ahead and view them, scroll around, and everything looks great. Now for our taxi, if we remember, this is an animated asset. As we saw in Converter, you can hit play, or in preview you can hit the space bar, and it'll automatically animate. In Composer, we don't automatically animate for you. You tell it when you want it to animate. So let's take a look at how that works.

We zoom in to our taxi. We'll make it a little bit larger so we can see it. With our taxi, we're gonna need to use behaviors. So let's make a new behavior. There's a plus sign; go to Custom. We'll give it a trigger. So I'm gonna tell it, on Scene Start, do something. So we'll select Scene Start, and in our sequence, we'll tell it to use a USDZ animation. Now we need to target it. Since I already had the taxi selected when I built this behavior, it automatically applied it to the behavior. So we don't even really need to target it. Now we can hit the space bar and simulate, and there we have our taxi moving. Now, by default, it goes one time through the animation and stops. I want this to loop endlessly. So I'm gonna go up here to the action sequence and tell it to loop endlessly. Now, if we simulate again, our taxi's running, endlessly animating.

Perfect. So let's take a look at our clouds. Our cloud asset, if we remember, we set up specifically so we could get at the individual entities. So let me show you how to do that.
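Before moving on to the clouds, a quick aside for developers: the behavior we just set up in Composer (play the taxi's USDZ animation on Scene Start and loop endlessly) has a direct code analogue if you bring the same animated USDZ into a RealityKit app. A hedged sketch, with the taxi.usdz resource name being illustrative rather than taken from the session:

```swift
import RealityKit

func addLoopingTaxi(to anchor: AnchorEntity) throws {
    // Load the animated USDZ exported from Reality Converter.
    let taxi = try Entity.load(named: "taxi")
    anchor.addChild(taxi)

    // The USDZ's animation comes across as an AnimationResource.
    if let drive = taxi.availableAnimations.first {
        // .repeat() with no arguments loops indefinitely, the code
        // equivalent of the "loop endlessly" toggle on the action sequence.
        taxi.playAnimation(drive.repeat(),
                           transitionDuration: 0.3,
                           startsPaused: false)
    }
}
```

In Composer itself, though, everything is done with behavior cards, so back to the clouds and how to reach the individual instances.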
If you select your clouds, you can right-click and get at Hierarchy Select. Here's where you can expand the USD scene that we've created, and this is how you can get at the individuals inside and do individual things. So with right-clicking and doing Hierarchy Select, I can move things around, I can scale them, and, more importantly, I can apply behaviors to them. Great.

So with that understanding, let's take this all over and use it in the context of a real AR experience. If we go to our scenes, I've already put one together. Here's a little town I put together using our assets, in a more contextual way. If we hit simulate, you can see I've already given it some ambient behaviors.

So let's take a look at our taxi asset to begin with. This particular taxi asset is slowly approaching, driving down the street, and then disappearing. Now, what's really happening under the hood is what this other one is doing. This other taxi is moving from point A to point B, and then rewinding and doing it again. But that doesn't look so great. So let me show you how to add those extra niceties to make this look like a real smooth transition in, transition out, and a loop, as if there were ambient taxis coming and going through your scene.

So let's jump down to the Behaviors and take a look. In our sequence, you can see I'm doing a Move To. This is moving a taxi from point A to point B. Now, after that, I'm simply giving it another Move To to tell it to go back to where it started from. So we need to add a couple more cards here to smooth out that transition. If we hit the plus sign, we can pick a Show, and we can also pick a Hide. Now, behaviors are evaluated left to right. So we're going to need to move these cards into position so that they function correctly. I'll select my Show card and move it to the beginning, 'cause that's the first thing I want it to do. Of course, we'll choose the asset that we want it to be triggered on. We're also going to want to hide it. Now, our Hide is not the last thing we do; it's actually the third to last thing. So we're going to put it in place. Then again, choose, and assign it to the taxi. Now, let's take a look at what that did for us. It fades in, it drives, and once it gets to the end it's going to fade out and rewind itself. But it doesn't quite look like the other one.

That is because Show and Hide cards have different ways of showing and hiding, and that's an important distinction. So let's scroll down a bit further and check out their options. If we give it a motion type, we can tell it what direction we want it to appear from. So let's try Move from Front. Now, I don't actually know what direction that's going to go, so that's why we have these little play buttons, and we can check and see what this card does in context. So I hit Play. And we got lucky. So let's take it. We can change the duration for how long we want it to take to show up. Let's say two seconds. We can also tell it how far we want it to travel to show up. So let's say something like 25 centimeters. And again, we can preview it to see if that looks about right. And perfect. So, last note on the card: we're going to want to change the animation type. We don't want it to ease in and come to a stop before the next card takes over. We want it to ease in and then continue on into the next animated card. So we're going to change its easing type to simply be Ease In. And then, we're going to let the next card have no easing type.
So it just smoothly transitions across the whole thing. For the Hide at the end, we're going to do the exact same thing. Whereas the Show card moved in from the front, this one will move out to the rear. We'll change it to have the same duration and the same distance. There we go. And for this easing type, we'll do Ease Out, since it's slowly leaving the scene. Now if we hit Play, we should have something a little more similar to what the other taxis are already set up and doing. It gets to the end, it smoothly drives off and disappears, and it then rewinds itself and begins again.

Perfect. So that's the basic concept of making something show, move, hide, and repeat itself. We can build on these basic concepts to make something a little bit more complex. Let's take a look at this police car. This police car shows up and moves, exactly how we did our taxi. Now, when it gets to the end of the street, it's going to make a turn. This turn is a little bit more complex to set up than just telling it to go from point A to point B, because that would be a linear move, and we want it to do an arc. So let's check it out.

If we scroll down here to the police car's sequence, we can see I'm doing a Show and a Move, just like we had before; then I'm going to do an Orbit and another Move. Now, what am I orbiting? I'm orbiting a tiny piece of geometry that I have hidden inside another area. An Orbit card is going to require you to have an affected object and a center object. So if I move this building, you can take a quick look: I have a pill hidden inside this building. That is what I'm using as its center to orbit around. Let's go ahead and put our building back, and you can see how that works. The car drives, and once it gets to the end of the street, it's going to orbit that pill 0.25 of a revolution, because I don't want to do a full circle, just a simple turn. Then another Move To card is going to take over. And we'll have a Hide card at the end to make it disappear, same as the taxi. So that's an easy way to take a simple behavior, such as showing, moving, and hiding, and add a couple of extra cards in the middle to create a richer and more convincing environment.

So let's take a look at our clouds. Like we did before, we can select individual clouds and make them do different things. This would look a little funny if all the clouds simply moved on their own. So let's check out what I've built. In this behavior, I'm using the same gimmick of Show and Hide, where I'm telling it to show a cloud, come in on a slow transition, and then hide out on a slow transition. And if we view this sequence, we can see our clouds are doing just that. Now, that's great and looks very convincing. But I want them all to do it. So let's hit Stop, and let's apply more behaviors. Now, while I want them all to do it, I want them to do it at different rates. So instead of building a new behavior over and over again, we can simply duplicate a behavior and re-target the cards to different clouds. So let's go ahead and do a Choose, and instead of targeting these again, we'll pick this guy here in the back. And of course, we'll do it again for the Hide. And we can change its timing to be something a little bit longer. Maybe 15 and 15. And let's simulate and see what that did. So now you can see my groups of clouds coming and going. I have three different cloud behaviors targeting different groups of clouds, and they're moving and going at different rates. Perfect.

So with all that set up, now's a good time to talk about context. Viewing AR is great in context.
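One more developer aside before we look at viewing in context: the two things these cards rely on, reaching a named instance inside the USDZ hierarchy and moving it over time, have rough code analogues in RealityKit, namely findEntity(named:) and move(to:relativeTo:duration:timingFunction:). A sketch under the assumption that an instance named "cloud_3" exists in the exported hierarchy (the name is illustrative, not from the session):

```swift
import RealityKit

func driftCloud(in scene: Entity) {
    // Reach into the loaded USDZ hierarchy by name, roughly what
    // Hierarchy Select lets you do interactively in Reality Composer.
    guard let cloud = scene.findEntity(named: "cloud_3") else { return }

    // Build a target transform 2 meters further along +X, relative to the parent.
    var target = cloud.transform
    target.translation.x += 2.0

    // Animate there over 15 seconds with easing, roughly what a
    // Move To card with an easing type does.
    cloud.move(to: target,
               relativeTo: cloud.parent,
               duration: 15,
               timingFunction: .easeInOut)
}
```

With that aside out of the way, back to viewing the scene in context.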
So we need to know: how big are things? Are things moving quickly or slowly in your actual world? This is where we have the iOS handoff. Here, on macOS, I can hit Edit on iOS and drop this project onto my iPad; then, on my iPad, make edits in context and drop it back to macOS. So let's give that a shot.

So here on macOS, we can simply do Edit on iOS, and with Reality Composer launched, you'll see our iPad show up. Now, we can do the handoff. So we'll simply click ourselves and Accept on the iPad. Now, it's transferring the project all the way over to my iPad for me to make edits. So we'll simply wait for it to sync, and we'll be good to go.

So here we go. We've transferred our file over to our iPad. Now, we can view and edit things in context. Here on iPad, we can see we have the Reality Composer app running and our project loaded. We can simulate here in the viewport, and we can see everything that we had set up on the Mac is working here on iOS. Now, the power of working on iOS is that we can drop this into our world and edit in real time. So let's go ahead and hit the AR affordance here in the corner. And we can target our table and drop our scene down. Now, with our scene placed in the real world, we can hit Play again, and we can view things in context. There goes our little police car. And here are our clouds coming and going. And there are our taxis moving.

Now, let's say I'm not happy with the rate something is moving. I can simply hit Stop, and I can jump over to the Behaviors tab. Let's grab our taxi here. And we can change the rate at which things show up. Let's say on taxi number two, I would like it to show up a little bit slower. I can change that from two to five. And we're set. We can hit Play again, and we can see that it now takes five seconds for it to show up, at a very slow pace.

Now, happy with the changes that we've made, we can stop simulating, drop out of AR mode, and simply hit Done in the top corner. That passes the project back to your Mac. Once back on macOS, we can verify that our behavior is updated with the changes that we made on our iOS project. So if we navigate here, we can see that the five-second change translated across for us smoothly. Once happy, you can save it and export it all out.

Let's hit one more topic before we export all this. I want to teach you how to make a splash page for your AR experience. Now, when using AR Quick Look, there is a load time while it syncs and anchors. While that's happening, you're shown a scene of your content. I'm going to teach you how to make a nice scene there for people to see while they wait for it to load. Now, the fastest way in Reality Composer to do that is to duplicate everything and remove what you don't want seen. So let me show you how that works. Here, we can do a Select All, and Copy. Now, on our Scenes tab, we'll go ahead and make a new scene. So we'll go ahead and remove this one, and we'll build a new one. There we go. Now, with a clean scene, we can just Paste, and all our geometry comes across from our other scene into this one. Now, what did not come along is the behaviors. So if we look at the behaviors, everything is gone. We simply have a static scene. And that's fine. That is what we want. So here, I'm going to give it one behavior.
So let's go ahead and do a Custom, On Scene Start, and we're going to give it a Change Scene. We'll choose our completed scene, and we're good to go. So what this is doing is: while the scene is loading, this is what you view. But when it drops down on your table and the scene actually starts, it immediately jumps over to our scene with the animated traffic and the moving clouds. So this, in a sense, is making a splash page for people to look at while your content is loading. And while we're here, if we don't want to show certain things, say the taxi or the clouds, you can simply remove them, and they come off the splash page. But when we simulate, it immediately jumps to the scene that has all the actual behaviors and animations.

Perfect. So our project is essentially done. Now, let's talk about exporting and sending to device. New to Reality Composer, we're going to give you the option to export out as a USDZ. Where before we were only giving you the Reality export, this is a new option that you'll need to enable in your preferences. So if we open our Preferences tab, you can see Enable USDZ Export as an option. Now, you're asking, "What's the difference?" The difference is going to be contextual, based on what you want to do with your project. If you're looking to embed it in a website, or put it in a News article, you're going to want to use the USDZ export. If you want to use it in an Xcode project, or you want to use it with AR Quick Look, you're going to want to stick with the Reality export. Now, I want to view this in AR Quick Look, so I'm going to stick with the Reality export. Let's do a File, Export, and we'll save this to disk as a Reality file.

So with our newly exported file, we can go ahead and AirDrop that onto our device for viewing and sharing. So let's do that. I can select my Reality export, right-click, and do Share, AirDrop. Now, once our device comes up, we can send it to ourselves. Great. So we see it pop up here on our iPad. We can go ahead and accept it and save it to our files. Now, with it saved in our Files, it'll automatically launch in AR Quick Look and drop it onto our desk. And once it loads, our town should be here. Perfect.

So that's the overall workflow using Reality Converter and Reality Composer. We have seen how we can take geometries from a variety of DCCs and, using Reality Converter, make them into USDZs. We then took those USDZs into Composer and created an AR experience with simple behavior setups. From that, we sent it all over to AR Quick Look to view, enjoy, and share. Now, this is all in service of building great AR experiences that are lightweight and easily shareable. So with these tools, you can now create even more compelling content for your experience.

Moving forward from this talk, you might find yourself with questions, or wanting to know more about topics that I touched on. Here are some great resources to help you dive deeper on the topics we discussed. If you are new to Reality Composer, I encourage you to check out the Apple YouTube video that gives an overview of the software. For Reality Converter, you can download the app from the developer site and check out the documentation. And for Xcode projects that leverage Reality Composer, you can view the samples under the RealityKit section. Thank you so much for taking the time to view my talk today. I really hope you found it fun, informative, and inspiring.
So I look forward to seeing all the great content and experiences Reality Converter and Reality Composer will enable you to create. I am Philip Simmons, and this has been a look at the Artist AR Toolkit.
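A practical coda for developers: the exported file doesn't have to be AirDropped. An app can present the same .reality or .usdz file in AR Quick Look with QLPreviewController. A minimal hedged sketch, where the Town.reality file name is illustrative and assumed to be bundled with the app:

```swift
import UIKit
import QuickLook
import ARKit

final class TownPreviewController: UIViewController, QLPreviewControllerDataSource {
    // Present the exported Reality Composer file in AR Quick Look.
    func showTown() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // The file exported from Reality Composer, included in the app bundle.
        let url = Bundle.main.url(forResource: "Town", withExtension: "reality")!
        let item = ARQuickLookPreviewItem(fileAt: url)
        item.allowsContentScaling = true
        return item
    }
}
```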
September 3rd, 2020 by Oleg Afonin
Category: «Elcomsoft News», «Mobile»
Last year, we developed an innovative way to extract iPhone data without a jailbreak. The method's numerous advantages came with one major drawback: an Apple ID enrolled in Apple's paid Developer program was required to sign the extraction binary. This is no longer an issue on Mac computers thanks to the improved sideloading technique.
What’s this all about
When extracting an iOS device (pulling the file system and decrypting the keychain), one needs low-level access to the device. Traditionally, we’ve been using public jailbreaks for privilege escalation, yet recently we switched to a new method that does not require a jailbreak. Jailbreak-free extraction utilizes an Elcomsoft-developed extraction agent. Agent-based extraction provides tangible benefits over the traditional extraction method based on jailbreaking the device, being a safer, faster, and more robust alternative. Until today, agent-based extraction had a major drawback: in order to have the extraction binary signed, it required an Apple account registered in the Apple Developer program. We’ve circumvented this restriction in the latest release of iOS Forensic Toolkit for Mac. Users of the Windows edition still need the Developer account to perform the extraction.
A bit of history
Apple has a tight grip on the iOS ecosystem. In Apple's dream, everyone pays for access to the app ecosystem. Developers must purchase Apple hardware and Apple development tools; they must also purchase membership in the paid Apple Developer program. There is a separate recurring fee for publishing apps in the App Store, but even that fee does not guarantee acceptance.
Users who wanted to install a non-approved app from a channel other than the official App Store had several choices. They could jailbreak the device and use one of the alternative app stores. They could pay Apple for the privilege by registering as a Developer. Finally, they could install a limited number of apps to their device and keep them for as long as 7 days without paying a dime. This process is called “sideloading”, and this is the same process that was used by forensic experts to install extraction software for imaging devices.
Historically, iOS users and forensic experts had been able to sideload third-party apps by using an ordinary, often throwaway, Apple ID for signing the binaries. Cydia Impactor is a free tool often used for the purpose, but alternatives also exist. In November 2019, Apple made an abrupt change to their provisioning service, effectively blocking the sideloading mechanism for all but users of a paid Apple Developer account. Saurik, the developer of Cydia Impactor, tweeted about the issue. Since then, nothing but a paid Apple Developer account could be used to sign the binaries. Officially, this is still the case today. Unofficially, around the same time last year many users started having problems registering a personal Apple Developer account.
Developer or throwaway Apple ID for iPhone extraction?
A subscription to become a registered developer is affordable; it's a tiny fraction of the cost of the tools one needs to extract an iPhone. Using a developer account for sideloading the extraction software has tangible benefits over using a regular or anonymous (throwaway) Apple ID. We published a blog article explaining the benefits of the developer account compared to a throwaway Apple ID for the purpose of iOS extraction. In short, using an Apple account registered in the Developer program allows signing and sideloading apps while bypassing the on-device certificate verification, which would otherwise require an Internet connection on the device, with all the risks of exposing the device to a remote lock or erase.
However, a large number of experts working in law enforcement were, and remain, hesitant to obtain such accounts for various non-financial reasons. Registering for a personal developer account with Apple has become particularly challenging in recent months, with Apple rejecting numerous applications with no explanation and no resolution through their support service. If you are seeing the “Your enrollment could not be completed” message, check out this thread for suggestions.
As a result, we felt compelled to develop a working solution allowing experts to use regular or anonymous (throwaway) Apple IDs for signing the extraction software and performing the imaging.
Our work
We have discovered a way to enable the use of regular and disposable Apple IDs for the purpose of agent-based data extraction. All you need is the latest build of iOS Forensic Toolkit, a Mac computer, and a cable to connect the iPhone to the Mac. In this guide, we'll demonstrate how to image an iOS device with a disposable Apple ID.
Compatibility, pre-requisites and restrictions
In order to launch the attack, you will need all of the following.
- A compatible iPhone model running a supported version of iOS. The list of supported devices is available below. At this time, agent-based extraction is supported for all models ranging from the iPhone 5s through 11 Pro Max with iOS 9.0 through 13.5.
- A desktop or laptop computer with macOS 10.12 (Sierra) through 10.15 (Catalina).
- A Lightning cable.
- Apple ID (personal or disposable) with or without two-factor authentication.
- iOS Forensic Toolkit 6.50 or newer.
Compared to using a Developer account, signing the extraction agent with a regular or throwaway Apple ID has the following restrictions.
- The signing certificate is only valid for 7 days. This is normally not an issue, as extractions should be performed on the same day the agent is sideloaded.
- A non-developer Apple ID can only be used to sign apps for a handful of devices (it was 3 devices when we last checked). This is why many experts prefer creating throwaway Apple IDs for device extractions.
- You will need to pass two-factor authentication when signing in. The 2FA code will be pushed onto a trusted device (no SMS delivery), so you’ll need to have one ready. Note that you will not be prompted for the code if the Mac is already trusted (e.g. it is tied to the Apple ID you are about to use). We have not tested the tool with non-2FA accounts.
- You will need to approve (Trust) the signing certificate on the iOS device. This is only possible when the device is connected to the Internet, so you'll have to break the “Airplane mode only” rule, accepting the risk of a possible remote lock or erase command.
Steps to extract
Important: if you are performing a forensic extraction (as opposed to extracting your own iPhone), set up a restricted Internet connection first, as described in the following article: Setting Up Restricted Internet Connection.
- Download Elcomsoft iOS Forensic Toolkit.
- Install the toolkit by following the instructions in How to Install and Run iOS Forensic Toolkit on a Mac.
- Launch iOS Forensic Toolkit.
- Press 1 to sideload the agent onto the device. Note that you will have to pass two-factor authentication by entering the original Apple ID password (not an app-specific password as you previously would), and then a one-time code (which will be pushed to a trusted device).
- Verify (trust) the extraction agent on the device; this step requires an Internet connection. Once you have verified the extraction agent, launch it by tapping its app icon.
- Press 2 to extract and decrypt the keychain.
- Press 3 to extract the file system image.
- Press 4 to remove the extraction agent from the device.
We strongly recommend extracting both the keychain and the file system, as the content of the keychain can be used to decrypt certain app data (e.g. WhatsApp cloud backups, Signal, and so on). The file system image can be analyzed in Elcomsoft Phone Viewer or another forensic product.
Conclusion
Since November last year, sideloading apps onto iOS devices has become more challenging. Apple made changes to its provisioning service, effectively breaking sideloading for all but users of a paid Apple Developer account. We have discovered a way to enable the use of regular and disposable Apple IDs for agent-based data extraction. All you need is the latest build of iOS Forensic Toolkit, a Mac computer, and a cable to connect the iPhone to the Mac. In this guide, we demonstrated how to image an iOS device with a disposable Apple ID.
iOS Forensic Toolkit for Mac circumvents the provisioning restriction that obliges users to use an Apple Developer account for imaging iOS devices. The Mac edition once again allows experts to use regular or throwaway Apple IDs for extracting the file system and decrypting the keychain from compatible iPhone and iPad devices. However, if one already has an Apple Developer account, we recommend continuing to use that account to sideload the extraction binary due to the tangible benefits of this approach.