Build, test, and submit your app using Xcode, Apple's integrated development environment.

Posts under Xcode subtopic
A Summary of the WWDC25 Group Lab - Developer Tools
At WWDC25 we launched a new type of Lab event for the developer community: Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers, and a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Developer Tools.

Will my project codebase be used for training when I use Xcode's intelligent assistant powered by cloud-based models?

When using ChatGPT without logging in, your data will not be used to improve any models. If you log in to a ChatGPT account, this depends on your ChatGPT account settings, which allow you to opt out (the default is on). When using Xcode with accounts for other model providers, check your provider's policies. And finally, at no point will any portion of your codebase be used to train or improve any Apple models.

We'd love to make our SwiftUI Previews (and soon, Playgrounds) as snappy as possible. Is there any way to skip certain build steps, such as running linters? The build environment seems to be exactly the same as a debug build, but maybe there's a trick.

Starting with Xcode 16, SwiftUI previews use the exact same build artifacts as the regular build, and the new Playgrounds support in Xcode 26 uses these build artifacts too. Shell script build phases are the most common source of extra build time, so as a first step, try turning off all shell script build phases (like linters) to see whether that's the issue. If those phases add significant time to your build, consider moving some of them into asynchronous steps, such as running linters before committing instead of on every build. If you do need a shell script build phase to run during your build, make sure to explicitly define its input and output files, as that can dramatically improve your build performance.

Are we able to provide additional context for the models, like coding standards, documentation for third-party dependencies, or documentation on your own codebase that explains things like architecture?

In general, Xcode automatically searches for the right context based on the question and the evolving answer, since the model can interact with your project multiple times as it develops an answer. It will automatically pick up the coding style of the code it sees, and can include files that contain architecture comments and the like. Beyond automatic context, you can manually attach other documents, even if they aren't in your project. For example, you could make a file with rules and ideas and attach it, and it will influence the response. We are aware of other kinds of automatic context, like rule files, though Xcode does not support these at this time.

Once ChatGPT is enabled for Coding Intelligence in Xcode 26 and I sign in to my existing ChatGPT account, will the ChatGPT Coding Intelligence model in Xcode know about my previous Xcode-related conversations in the ChatGPT Mac app?

No. Xcode does not use information from other conversations, and conversations started in Xcode are not accessible in the web UI or the ChatGPT app.

Is there a plan to make SwiftUI views easier to locate and understand in the view hierarchy, like UIKit views?

SwiftUI uses a declarative paradigm to define your user interface: you specify what you want, and the system translates that into an efficient representation at runtime. Unlike traditional AppKit and UIKit, seeing the runtime representation of SwiftUI views isn't sufficient to understand why they aren't doing what you want. This year we introduced a SwiftUI instrument that shows why things are happening, like view re-rendering.

Is it possible to use the AI chat with ChatGPT Enterprise? My company doesn't allow us to use the general ChatGPT, only the enterprise version they have set up that prevents data from being leaked.

Yes, Xcode 26 supports logging in to any existing ChatGPT account, including enterprise accounts. If that does not meet your needs, you can also set up a local server that implements the popular chat completions REST API to talk to your enterprise account however you need.

Now that Icon Composer is here, how does it complement or replace existing vector design tools such as Sketch for icon design?

Icon Composer complements your existing vector design tools. You should continue to create your shapes, gradients, and layers in another tool like Sketch, then compose the exported SVG layers in Icon Composer. Once you bring your layers into Icon Composer, you can use it to influence the translucency, blur, and specular highlights of your icon.

What's one feature or improvement in the new Xcode that you personally think developers will love, but might not immediately discover? Maybe something tucked away or quietly powerful that's flown under the radar so far?

One feature we're particularly excited about is the new power profiler for iOS, which gives you deeper insight into your app's energy consumption than the Energy instrument did previously. You can learn more about this instrument and how it can help you greatly reduce your app's battery usage in the documentation, as well as in the session "Profile and optimize power usage in your app." There were also accessibility improvements this year with Voice Control: you can naturally speak your Swift code to Xcode, and it understands Swift syntax as you speak. To see it in action, take a look at the demonstration in "What's new in Xcode 26."

We have a software advisory council that is very sensitive to having our private information going to the cloud in any form. What information do you have to help me guide Xcode and Apple Intelligence through the acceptance process?

One thing you can do is configure a proxy for your enterprise that implements the popular Chat Completions API endpoint protocol. When using a model provider via URL, you can point Xcode at your proxy endpoint, inspect the network traffic for anything you do not want sent outside your enterprise, and then forward the traffic through the proxy to your chosen model provider.

Is there a list of recommended LLMs to use with Xcode locally? I've tried Gemma3-12B, but I hope there are better options.

Apple doesn't have a published list of recommended local models. This is a fast-moving space, so any recommendation would become out of date very quickly as new models are released. We encourage you to try out the local model support in Xcode 26 with models that meet your needs, and let us and the community know! (continued below)
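Two of the answers above point to the popular chat completions REST API as the integration surface for local servers and enterprise proxies. As a rough illustration of that request/response shape (not an Apple-documented Xcode requirement), here is a minimal Swift client sketch; the endpoint URL, port, and model name are placeholders:

```swift
import Foundation

// Minimal sketch of the chat completions request/response shape that a
// local server or proxy would implement. The URL and model name below
// are placeholders, not values Xcode requires.
struct ChatMessage: Codable {
    let role: String      // "system", "user", or "assistant"
    let content: String
}

struct ChatRequest: Codable {
    let model: String
    let messages: [ChatMessage]
}

struct ChatChoice: Codable { let message: ChatMessage }
struct ChatResponse: Codable { let choices: [ChatChoice] }

func sendChat(prompt: String) async throws -> String {
    // Placeholder local endpoint; a proxy would accept this same JSON
    // shape and forward it to the upstream provider.
    var request = URLRequest(url: URL(string: "http://localhost:8080/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    let body = ChatRequest(model: "my-local-model",
                           messages: [ChatMessage(role: "user", content: prompt)])
    request.httpBody = try JSONEncoder().encode(body)

    let (data, _) = try await URLSession.shared.data(for: request)
    let response = try JSONDecoder().decode(ChatResponse.self, from: data)
    return response.choices.first?.message.content ?? ""
}
```

A proxy built for the enterprise-inspection setup described above would accept this same JSON shape, log or filter it, and forward it upstream.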
Replies: 1 · Boosts: 0 · Views: 1.1k · Created: Jul ’25
Xcode 26.3 project fails to compile after updating
0   0x10109152c  __assert_rtn + 252
1   0x101006054  void std::__1::__introsort<std::__1::_ClassicAlgPolicy, ld::AliasAddressOrderer&, ld::Atom const**, false>(ld::Atom const**, ld::Atom const**, ld::AliasAddressOrderer&, std::__1::iterator_traits<ld::Atom const**>::difference_type, bool) + 0
2   0x1010059a4  ld::LayoutLinkedImage::assignAtomOffsetsInSection(ld::SectionLayout&, unsigned int, bool) + 680
3   0x10100a724  void dispatchForEach<ld::SectionLayout*, ld::LayoutLinkedImage::sortAtomsWithinSections()::$_0>(std::__1::span<ld::SectionLayout*, 18446744073709551615ul>, unsigned long, ld::LayoutLinkedImage::sortAtomsWithinSections()::$_0)::'lambda'(unsigned long)::operator()(unsigned long) const + 7852
4   0x180818aec  _dispatch_client_callout2 + 16
5   0x1808137f8  _dispatch_apply_invoke3 + 336
6   0x180818ad4  _dispatch_client_callout + 16
7   0x180801a60  _dispatch_once_callout + 32
8   0x180812938  _dispatch_apply_invoke + 252
9   0x180818ad4  _dispatch_client_callout + 16
10  0x1808359dc  _dispatch_channel_invoke.cold.5 + 32
11  0x18081113c  _dispatch_root_queue_drain + 736
12  0x180811784  _dispatch_worker_thread2 + 180
13  0x1809b7e10  _pthread_wqthread + 232

ld snapshot written at /tmp/AIChatDoctor-2026-03-10-150254.ld-snapshot
ld: Assertion failed: (aliasSectionNum == sectionNum && "alias and its target must be located in the same section"), function assignAliasAtomOffsetInSection, file Layout.cpp, line 4805.
clang: error: linker command failed with exit code 1 (use -v to see invocation)
Replies: 0 · Boosts: 0 · Views: 17 · Created: 10h
No identity found: Command CodeSign failed with a nonzero exit code
Hello! I'm new to Xcode and am developing an iOS app. I hit the following error when building my app:

MY_SHA: no identity found
Command CodeSign failed with a nonzero exit code

I have generated my certificate through Xcode. I can see the certificate and the private key in Keychain Access under login, but the private key is not nested under the certificate; they are listed side by side. The certificate appears in the Certificates tab but not in the My Certificates tab. Would that matter?

security find-certificate -a -Z | grep -A1 "Apple Development" gives me the certificate, but security find-identity -v -p codesigning gives "0 valid identities found".

Could you help me figure out what the reason could be? I've been stuck for several days now. Thank you so much!
Replies: 0 · Boosts: 0 · Views: 9 · Created: 12h
iPhone 17 Pro on iOS 26.3 stays unavailable in xcrun devicectl list devices even with Xcode 26.4 beta 2
I am unable to use my iPhone 17 Pro as a run destination in Xcode. The device appears at a low level, but CoreDevice / devicectl keeps reporting it as unavailable.

Environment
• Mac mini (Apple Silicon)
• macOS 26.3 (Build 25D125)
• iPhone 17 Pro, iOS 26.3
• Xcode 26.3 (Build 17C529)
• Xcode 26.4 beta 2 (Build 17E5170d)

Symptoms
• The iPhone appears in Finder
• On the iPhone, I tapped "Trust This Computer"
• Developer Mode is enabled
• Apple ID is added in Xcode Accounts
• Team is configured in Signing & Capabilities
• iOS Platform Support is installed
• The device still does not become available as a run destination
• Devices and Simulators does not show it in a usable state
• xcrun devicectl list devices still shows unavailable

Output of xcrun devicectl list devices

Failed to load provisioning paramter list due to error: Error Domain=com.apple.dt.CoreDeviceError Code=1002 "No provider was found." UserInfo={NSLocalizedDescription=No provider was found.}. devicectl manage create may support a reduced set of arguments.

Name  Hostname  Identifier  State  Model
      CAC1B56A-C3B4-59FF-8F9D-659277C7C76C.coredevice.local  CAC1B56A-C3B4-59FF-8F9D-659277C7C76C  unavailable  iPhone18,1

What I already tried
• Apple cable
• Different USB/Thunderbolt ports
• Direct connection to rear ports
• Restarted both Mac and iPhone
• Restarted Xcode
• Ran sudo killall usbmuxd
• Ran sudo pkill -9 remoted
• Upgraded Xcode from 26.2 to 26.3, then to 26.4 beta 2
• Ran sudo xcode-select -s /Applications/Xcode-beta.app
• Ran sudo xcodebuild -runFirstLaunch
• Reset "Location & Privacy" on the iPhone, then trusted the Mac again
• Still remains unavailable

Additional note
ioreg shows the iPhone, so the physical USB connection seems to exist. However, at the CoreDevice / devicectl layer the device remains unavailable, and Xcode cannot use it for on-device build/run.

Is this a known issue with macOS 26.3 / iOS 26.3 / Xcode 26.4 beta 2? Any workaround or additional debugging steps would be appreciated.
Replies: 0 · Boosts: 0 · Views: 29 · Created: 17h
Lock Screen shows skip buttons instead of next/previous when using MPRemoteCommandCenter with AVPlayer playlist
Hello, I’m building an iOS video player using AVPlayer and a custom playback queue. I implemented remote controls using MPRemoteCommandCenter and enabled:

• nextTrackCommand
• previousTrackCommand
• playCommand
• pauseCommand

I disabled:

• skipForwardCommand
• skipBackwardCommand
• seekForwardCommand
• seekBackwardCommand

I also set queue metadata in MPNowPlayingInfoCenter:

• MPNowPlayingInfoPropertyPlaybackQueueIndex
• MPNowPlayingInfoPropertyPlaybackQueueCount

Even with these commands enabled and the queue count greater than 1, the iOS Lock Screen continues to display the 10-second skip buttons instead of the previous/next track buttons. The commands themselves work correctly when triggered externally (Control Center, headphones, etc.), but the UI still shows the skip controls.

Is there a way to force the Lock Screen UI to display previous/next track buttons for a video playlist? Or is this behavior expected when using AVPlayer with video content? Thanks.
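For reference, here is a minimal Swift sketch of the configuration described above, assuming playback is driven elsewhere by the AVPlayer queue; the handler bodies are placeholders:

```swift
import MediaPlayer

// Sketch of the setup from the post: enable track-change commands,
// disable interval-skip and seek commands, and publish queue metadata.
func configureRemoteCommands(queueIndex: Int, queueCount: Int) {
    let center = MPRemoteCommandCenter.shared()

    center.nextTrackCommand.isEnabled = true
    _ = center.nextTrackCommand.addTarget { _ in
        // Advance the custom playback queue here.
        return .success
    }
    center.previousTrackCommand.isEnabled = true
    _ = center.previousTrackCommand.addTarget { _ in
        // Step back in the custom playback queue here.
        return .success
    }
    center.playCommand.isEnabled = true   // play/pause targets omitted for brevity
    center.pauseCommand.isEnabled = true

    // Disable the skip/seek commands so they don't compete with
    // the track-change commands.
    center.skipForwardCommand.isEnabled = false
    center.skipBackwardCommand.isEnabled = false
    center.seekForwardCommand.isEnabled = false
    center.seekBackwardCommand.isEnabled = false

    // Publish the queue position and length to Now Playing.
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyPlaybackQueueIndex] = queueIndex
    info[MPNowPlayingInfoPropertyPlaybackQueueCount] = queueCount
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}
```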
Replies: 0 · Boosts: 0 · Views: 9 · Created: 21h
PencilKit PKCanvasView flicker
I have the following setup: a UIViewRepresentable wrapping a UIView that holds a UIScrollView containing a UIImageView and a PKCanvasView. This is to make it appear as if you are writing on a piece of paper that you can zoom and scroll; the UIImageView holds the paper image.

The problem: if you draw some pen strokes at zoom level 1, then zoom in (say to zoom level 2), as soon as you put the pencil down, the PKCanvasView starts flickering repeatedly and very badly. A sample video is here: https://youtube.com/shorts/5zeq6EDheSM and a small sample project showing this is here: https://www.icloud.com/iclouddrive/0a0NSLP4bsism69L7jRtZPIhQ#BugDemo
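For anyone trying to reproduce this, here is a minimal sketch of the described hierarchy, assuming the image and canvas share a single zoomable container; the DrawingPaper name, fixed sizes, and "paper" asset are illustrative, not taken from the sample project:

```swift
import SwiftUI
import PencilKit

// Minimal sketch: a UIScrollView whose zoomable content is a paper
// image with a PKCanvasView overlaid on top of it.
struct DrawingPaper: UIViewRepresentable {
    func makeUIView(context: Context) -> UIScrollView {
        let scrollView = UIScrollView()
        scrollView.minimumZoomScale = 1
        scrollView.maximumZoomScale = 4
        scrollView.delegate = context.coordinator

        // Container holding both the paper image and the canvas.
        let container = UIView(frame: CGRect(x: 0, y: 0, width: 800, height: 1000))
        let paper = UIImageView(image: UIImage(named: "paper")) // placeholder asset
        paper.frame = container.bounds
        let canvas = PKCanvasView(frame: container.bounds)
        canvas.backgroundColor = .clear
        canvas.drawingPolicy = .anyInput

        container.addSubview(paper)
        container.addSubview(canvas)
        scrollView.addSubview(container)
        scrollView.contentSize = container.bounds.size
        context.coordinator.container = container
        return scrollView
    }

    func updateUIView(_ uiView: UIScrollView, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator() }

    final class Coordinator: NSObject, UIScrollViewDelegate {
        var container: UIView?
        // Zoom the container so the image and canvas scale together.
        func viewForZooming(in scrollView: UIScrollView) -> UIView? { container }
    }
}
```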
Replies: 0 · Boosts: 0 · Views: 9 · Created: 1d
iOS 26.2 Simulator is not available for download
I'm trying to download the iOS 26.2 simulator, but it fails both in Xcode and via the command line. When I run:

xcodebuild -downloadPlatform iOS -buildVersion 26.2 -exportPath ~/Downloads

it returns: iOS 26.2 is not available for download.

I would like to know:
• Why the iOS 26.2 simulator cannot be downloaded at the moment.
• Whether the iOS 26.2 simulator is still available, or if it has been removed or replaced.
• How to properly get and install the iOS 26.2 simulator.

Any official explanation or solution would be greatly appreciated. Thank you!
Replies: 0 · Boosts: 1 · Views: 50 · Created: 1d
Red "X" showing up for Certificates, Identifiers, & Profiles in xcode
I've upgraded to a new Macbook recently, just when I was setting up my Xcode, I realized I there is a this red "X" showing up next to my development team as I was signing in to my account I have checked my permission on App Store Connect, everything seemed fine. I have also deleted my old Apple development certificate and requested for another one. Nothing worked.
Replies: 1 · Boosts: 0 · Views: 27 · Created: 2d
Every Unity project with ARKit fails to open in TestFlight
Even when making a completely fresh Unity project, if I include ARKit and upload it to TestFlight, when I open it on my iPad it says "Couldn't Load App". If I don't include ARKit, the app opens fine, and if I build the app directly to my iPad it works fine. It also opens if I set ARKit to "optional", but then the AR doesn't work. I have included a camera and location usage description, so it's not that. Really not sure what's going on; any help would be massively appreciated!
Replies: 2 · Boosts: 0 · Views: 55 · Created: 2d
Launch screen update
Hi. I need to change the launch screen in my app. During development, when I install a new version of the app over the old one, I sometimes see the old launch screen. Could this also happen in production when I publish the app to the App Store? Is the launch screen cached somehow?
Replies: 0 · Boosts: 0 · Views: 22 · Created: 4d
Xcode 26.3 not rendering Unicode/Emoji in Simulator or Canvas
I'm based in Australia and am having the following issue. After posting this message I received a response from someone else who is having the same issue; I've pasted that below. I'm quite surprised not to see it mentioned anywhere; it could be local to language settings or something. Here is the issue:

Folks, I am having an issue where Xcode 26.3 with the new iOS 26.2/26.3.1 Simulator update installed doesn't render Unicode characters in the Canvas or in the Simulator. They do render when running directly on a device. I've reset devices inside Simulator, cleared Derived Data, cleaned build folders, etc., but to no avail! Is anyone else seeing this? I can't find a mention anywhere! Thank you in advance.

Yep. Same here. Can't see emojis on the emoji picker keyboard.
Replies: 4 · Boosts: 8 · Views: 309 · Created: 4d
Request: Option to Position Coding Assistant on the Right Side in Xcode
Dear Xcode Team, please consider adding an option to position the Coding Assistant on the right side of the editor. Currently, when the assistant panel is open, it is difficult to access the Project Navigator at the same time. The ability to move the Coding Assistant to the right side would let developers keep the Project Navigator visible while interacting with the assistant, improving workflow and efficiency. Thank you.
Replies: 0 · Boosts: 3 · Views: 35 · Created: 5d
Xcode 26 references a local package twice for test builds
My project uses about 10 local packages. When Xcode determines its dependency graph, it includes project Foo (a package) and project Foo_Foo (???). This is a no-op for normal builds, but it causes the Foo package to be linked twice for test builds, which then fail due to duplicate symbols. Nowhere in my repository is Foo_Foo defined or referenced, and nowhere in the .xcodeproj/ folder is there any mention of Foo_Foo. What am I doing wrong?
Replies: 1 · Boosts: 0 · Views: 94 · Created: 5d
Is it possible to download or copy the iOS SDK for Xcode 26.1 using command-line tools? If so, how?
I have Xcode 26.1.1 (installed on a Mac mini with Sequoia 15.7.2), which was installed from the universal .xip file available from the Apple Developer site. This file is 2.7 GB. When Xcode runs, there is a menu item at Settings -> Components, and the top section "Platform Support" includes an option to "Get" the iOS SDK, which is an additional 10.3 GB download. [1][2]

I would like to either use command-line tools to obtain this iOS SDK component, or find where the downloaded package or its contents are installed on this Mac. This Apple documentation gives some examples of using command-line tools to download components for Xcode; however, there is no specific documentation on how to separately download the iOS SDK. It seems that, if possible, it could be done with "xcodebuild -downloadComponent", but I have tried a number of likely component names (ios, ios_sdk, ios26.1, ios_sdk26.1, iphoneos, iphoneos_sdk, iphoneos_sdk26.1), and none are valid.

Does anyone know either where the SDK download can be found, or how to download it using command-line tools?

Background: I require this because I am managing a fleet of Mac minis located in a data centre (used for iOS/Android builds), and Internet access is restricted: any action requiring Internet access needs the Mac to be removed from the machine-room racks and taken to a build room, then connected to a mobile hotspot or similar. This access is slow, and the 10.3 GB download above can take over an hour. If I can download the SDK separately, I can distribute it to all of the Macs over the internal network overnight, in advance of visiting the DC for upgrade tasks.

[1] Various websites say that the iOS SDK is "bundled with" Xcode. However, if that's the case, a) how is a 10.3 GB download somehow included in a 2.7 GB file, and b) why do I still need to "Get" this component after installing Xcode from the 2.7 GB file?

[2] Note that I am not looking for the iOS Simulator; I have already downloaded the iOS 26.1 version of that. I am specifically looking for the iOS SDK as listed under "Platform Support" in the Xcode menu described above.
Replies: 2 · Boosts: 0 · Views: 49 · Created: 6d
iOS 26.3.0 TextToSpeech EXC_BAD_ACCESS
Hello Apple Developer Community,

I'm seeing a persistent crash in my iOS app reported via Firebase Crashlytics. The issue only started appearing on devices running iOS 26.3.0 and above (the crash does not occur on lower iOS versions, and it's unrelated to my app's version number).

Key points:
• My app does NOT use any Text-to-Speech (TTS) features whatsoever. No AVSpeechSynthesizer, no Speech framework, no related APIs called from our code.
• My app is primarily written in Objective-C (with some Swift components, possibly via dependencies).
• The crash stack is entirely within Apple's private TextToSpeech framework, specifically in ausdk::BufferAllocator::instance().
• I suspect it might be indirectly triggered by a third-party ad SDK (e.g., Google Mobile Ads, AppLovin, etc.) that could be loading or interacting with accessibility features in the background, but this is just a hypothesis, as I have no direct evidence yet.

Here is one representative crash log:

Crashed: com.apple.root.user-initiated-qos.cooperative
0  TextToSpeech               0x6bb00  ausdk::BufferAllocator::instance() + 99800
1  TextToSpeech               0xf8c60  ausdk::BufferAllocator::instance() + 677688
2  TextToSpeech               0xf8c60  ausdk::BufferAllocator::instance() + 677688
3  TextToSpeech               0x1a0b9c ausdk::BufferAllocator::instance() + 1365620
4  libswift_Concurrency.dylib 0x628b4  swift::runJobInEstablishedExecutorContext(swift::Job*) + 288
5  libswift_Concurrency.dylib 0x63d28  swift_job_runImpl(swift::Job*, swift::SerialExecutorRef) + 156
6  libdispatch.dylib          0x13f48  _dispatch_root_queue_drain + 364
7  libdispatch.dylib          0x146fc  _dispatch_worker_thread2 + 180
8  libsystem_pthread.dylib    0x137c   _pthread_wqthread + 232
9  libsystem_pthread.dylib    0x8c0    start_wqthread + 8

The crash occurs on a background cooperative queue (Swift Concurrency).

Questions:
• Has anyone else seen crashes inside ausdk::BufferAllocator::instance() in TextToSpeech on iOS 26.3.0+ even without using TTS in their app?
• Could a third-party ad SDK be causing the TextToSpeech framework to load unexpectedly (e.g., via accessibility preloading)?
• Is this a known bug in iOS 26's Spoken Content / Speak Selection features?
• Are there any workarounds or fixes from Apple?

Any insights, similar reports (especially from Objective-C based apps), or advice would be greatly appreciated! Thanks!
Replies: 0 · Boosts: 1 · Views: 63 · Created: 6d