I am trying to use @AssistantIntent(schema: .system.search) to search in my app via a Siri voice command, but I want to be able to return a .result that does not open the app, while still getting the model-training benefits of the schema.
I'm very new to this (it's my first app), so I would appreciate some guidance. I haven't gotten to the voice part yet; so far I've only tested through Shortcuts.
Do I need to write App Intents without the schema and wait until there is a search schema that doesn't open the app, or should I be using a different schema? What am I missing?
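For reference, the schema-based search intent from Apple's samples looks roughly like this (a sketch; note that ShowInAppSearchResultsIntent is defined around showing search results in the app, which is consistent with the behavior described):

import AppIntents

// Sketch following Apple's assistant schema samples; the schema's
// generated shape supplies the criteria and searchScopes requirements.
@AssistantIntent(schema: .system.search)
struct SearchIntent: ShowInAppSearchResultsIntent {
    static let searchScopes: [StringSearchScope] = [.general]
    var criteria: StringSearchCriteria

    @MainActor
    func perform() async throws -> some IntentResult {
        // Navigating to in-app results for criteria.term would go here.
        return .result()
    }
}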
Apple Intelligence
Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac and powers incredible new features to help users communicate, work, and express themselves.
I have built a macOS machine-intelligence application that uses Apple Intelligence. Part of the application preprocesses text, and for longer content I have implemented chunking to get around the token limit. However, application performance is now limited by the fact that Apple Intelligence operates sequentially, which has a large impact.
Is there any approach to operating Apple Intelligence in a parallel mode, or even through a streaming interface? Since Apple Intelligence has Private Cloud Compute, I was hoping to send multiple chunks in parallel, as that would significantly improve performance.
Any suggestions would be welcome. This could also be considered a request for a future enhancement.
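A minimal sketch of the fan-out being asked about, assuming the Foundation Models framework's LanguageModelSession underneath (whether independent sessions actually execute in parallel is exactly the open question; streaming exists separately as streamResponse(to:)):

import FoundationModels

// Assumption: one independent session per chunk, fanned out with a task group.
func summarizeChunks(_ chunks: [String]) async throws -> [String] {
    try await withThrowingTaskGroup(of: (Int, String).self) { group in
        for (index, chunk) in chunks.enumerated() {
            group.addTask {
                let session = LanguageModelSession()
                let response = try await session.respond(to: "Summarize: \(chunk)")
                return (index, response.content)
            }
        }
        // Reassemble the responses in the original chunk order.
        var results = [String?](repeating: nil, count: chunks.count)
        for try await (index, text) in group {
            results[index] = text
        }
        return results.compactMap { $0 }
    }
}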
Topic: Machine Learning & AI / SubTopic: Apple Intelligence
After running performance tests on my CoreML qwen3 vision model, I appreciated the update where results were viewable. On the Mac it mentions iOS 18, and I'm not sure if or how to change that; that bottleneck led to rebuilding my CoreML view.
I woke up and realized I had all the pieces together, and ended up with a working Swift package demo of Clawbot.
The current issue is that I'm trying to use a 3B GGUF model to code it. I've become well aware that everything I create using the big models soon becomes the default themes/layouts for everyone else simply asking for this or that (I apologize).
So here I am asking (while looking to schedule a meeting with a developer) whether it's possible to speak with anyone about the thousands of Apple Intelligence, PCC, Xcode, and Vision reports and feedback I've sent, in terms of general ways I can work more efficiently without the crashes.
I've already built a TUI for MLX, but the tools for CoreML, while they seem promising, are not intuitive; the Vision format instruction was nice to see, though.
Anyway, my question is:
Topic: Machine Learning & AI / SubTopic: Apple Intelligence
Things I did:
created an Intents Extension target
added "Supported Intents" to both my main app target and the intent extension, with "INAddTasksIntent" and "INCreateNoteIntent"
created the AppIntentVocabulary in my main app target
created the handlers in the code in the Intents Extension target
class AddTaskIntentHandler: INExtension, INAddTasksIntentHandling {
    func resolveTaskTitles(for intent: INAddTasksIntent) async -> [INSpeakableStringResolutionResult] {
        if let taskTitles = intent.taskTitles {
            return taskTitles.map { INSpeakableStringResolutionResult.success(with: $0) }
        } else {
            return [INSpeakableStringResolutionResult.needsValue()]
        }
    }

    func handle(intent: INAddTasksIntent) async -> INAddTasksIntentResponse {
        // my code to handle this...
        let response = INAddTasksIntentResponse(code: .success, userActivity: nil)
        response.addedTasks = tasksCreated.map {
            INTask(
                title: INSpeakableString(spokenPhrase: $0.name),
                status: .notCompleted,
                taskType: .completable,
                spatialEventTrigger: nil,
                temporalEventTrigger: intent.temporalEventTrigger,
                createdDateComponents: DateHelper.localCalendar().dateComponents([.year, .month, .day, .minute, .hour], from: Date.now),
                modifiedDateComponents: nil,
                identifier: $0.id
            )
        }
        return response
    }
}
class AddItemIntentHandler: INExtension, INCreateNoteIntentHandling {
    func resolveTitle(for intent: INCreateNoteIntent) async -> INSpeakableStringResolutionResult {
        if let title = intent.title {
            return INSpeakableStringResolutionResult.success(with: title)
        } else {
            return INSpeakableStringResolutionResult.needsValue()
        }
    }

    func resolveGroupName(for intent: INCreateNoteIntent) async -> INSpeakableStringResolutionResult {
        if let groupName = intent.groupName {
            return INSpeakableStringResolutionResult.success(with: groupName)
        } else {
            return INSpeakableStringResolutionResult.needsValue()
        }
    }

    func handle(intent: INCreateNoteIntent) async -> INCreateNoteIntentResponse {
        do {
            // my code for handling this...
            let response = INCreateNoteIntentResponse(code: .success, userActivity: nil)
            response.createdNote = INNote(
                title: INSpeakableString(spokenPhrase: itemName),
                contents: itemNote.map { [INTextNoteContent(text: $0)] } ?? [],
                groupName: INSpeakableString(spokenPhrase: list.name),
                createdDateComponents: DateHelper.localCalendar().dateComponents([.day, .month, .year, .hour, .minute], from: Date.now),
                modifiedDateComponents: nil,
                identifier: newItem.id
            )
            return response
        } catch {
            return INCreateNoteIntentResponse(code: .failure, userActivity: nil)
        }
    }
}
uninstalled my app
restarted my physical device and simulator
Yet when I say "Remind me to buy dog food in Index" (Index is the name of my app), as given in the examples for INAddTasksIntent, Siri says that a list named "Index" doesn't exist in the Apple Reminders app instead of handing the request to my app.
Am I missing something?
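One thing worth double-checking, as a sketch (this assumes the extension's principal class, which isn't shown above): the Intents Extension's principal class has to route each incoming intent to the matching handler, otherwise Siri can fall back to the system apps.

import Intents

// Hypothetical principal class for the Intents Extension; it must be the
// one named by NSExtensionPrincipalClass in the extension's Info.plist.
class IntentHandler: INExtension {
    override func handler(for intent: INIntent) -> Any {
        switch intent {
        case is INAddTasksIntent:
            return AddTaskIntentHandler()
        case is INCreateNoteIntent:
            return AddItemIntentHandler()
        default:
            return self
        }
    }
}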
What's the code to warm the model up once? I saw this in a developer video but cannot find it now; the goal is to prevent a cold start within an application.
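If the video in question was about the Foundation Models framework, the call may have been prewarm(); a minimal sketch under that assumption:

import FoundationModels

let session = LanguageModelSession()
// Load model resources ahead of the first request to avoid a cold start;
// call this once, for example when the relevant screen appears.
session.prewarm()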
Thank you in advance!
We're in the process of migrating our app's custom intents from the older SiriKit Custom Intents framework to App Intents. The migration has been straightforward for our app-specific actions, and we appreciate the improved discoverability and Apple Intelligence integration that App Intents provides.
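For the app-specific side, the migration target looks roughly like this (a minimal sketch with hypothetical names, not our actual intents):

import AppIntents

struct SendQuickReplyIntent: AppIntent {
    static var title: LocalizedStringResource = "Send Quick Reply"

    @Parameter(title: "Message")
    var message: String

    func perform() async throws -> some IntentResult {
        // App-specific send logic would go here.
        return .result()
    }
}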
However, we also implement SiriKit domain intents for calling and messaging:
INStartCallIntent / INStartCallIntentHandling
INSendMessageIntent / INSendMessageIntentHandling
These require us to maintain an Intents Extension to handle contact resolution and the actual call/message operations.
Our questions:
Is there a planned App Intents equivalent for these SiriKit domains (calling, messaging), or is the Intents Extension approach still the recommended path?
If we want to support phrases like "Call [contact] on [AppName]" or "Send a message to [contact] on [AppName]" with Apple Intelligence integration, is there any way to achieve this with App Intents today?
Are there any WWDC sessions or documentation we may have missed that address the migration path for SiriKit domain intents?
What we've reviewed:
"Migrate custom intents to App Intents" Tech Talk
"Bring your app's core features to users with App Intents" (WWDC24)
App Intents documentation
These resources clearly explain custom intent migration but don't seem to address the system domain intents.
Our current understanding:
Based on our research, it appears SiriKit domain intents should remain on the older framework, while custom intents should migrate to App Intents. We'd like to confirm this is correct and understand if there's a future direction we should be planning for.
Thank you!
Hi everyone,
I’m currently using macOS Version 15.3 Beta (24D5034f), and I’m encountering an issue with Apple Intelligence. The image generation tools seem to work fine, but everything else shows a message saying that it’s “not available at this time.”
I’ve tried restarting my Mac and double-checked my settings, but the problem persists. Is anyone else experiencing this issue on the beta version? Are there any fixes or settings I might be overlooking?
Any help or insights would be greatly appreciated!
Thanks in advance!
I am writing an app that parses text and carries out some actions. I don't want to give too much away ;)
However, I am having a huge problem with token limits. LanguageModelSession of course gives me the on-device model's 4096-token context, but when I go over 4096, my code doesn't seem to fall back to PCC, or even to the system-configured ChatGPT. Can anyone assist me with this? Even after reading the docs, it's very unclear how the transition between the three takes place.
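A sketch of what may be happening, under the assumption that the framework throws rather than falling back when the context is exceeded (the error case name is from the Foundation Models framework):

import FoundationModels

func process(_ prompt: String) async {
    let session = LanguageModelSession()
    do {
        let answer = try await session.respond(to: prompt)
        print(answer.content)
    } catch LanguageModelSession.GenerationError.exceededContextWindowSize {
        // Assumption: no automatic fallback to PCC or ChatGPT occurs here;
        // the input has to be split into smaller chunks and retried,
        // each in a fresh session.
    } catch {
        print(error)
    }
}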
Topic: Machine Learning & AI / SubTopic: Apple Intelligence
Is Private Cloud Compute allowed, or are only the on-device Foundation Models allowed? Thanks!
My app lets you create images with Image Playground. When the user approves an image, I move it from temporary storage to the Documents directory. Over more than a year of usage I've created a lot of images.
Out of nowhere, the app stopped loading my custom creations from Image Playground, saying it couldn't find the files. It still had the VoiceOver strings I had added for each image and the custom categories I had assigned them.
Debug code that looks in the Documents directory doesn't find them. I downloaded the app's container and only see the images I created as a test after the problem started.
But my ~70 MB app is still taking up 300 MB on my iPhone, so it feels like they're there but not accessible.
Is there anything else I can try?
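For reference, the kind of debug listing described above might look like this minimal sketch (listing the top level of Documents with file sizes):

import Foundation

let docs = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
if let items = try? FileManager.default.contentsOfDirectory(
    at: docs,
    includingPropertiesForKeys: [.fileSizeKey],
    options: []
) {
    for url in items {
        // Print each entry with its size to see what is physically present.
        let size = (try? url.resourceValues(forKeys: [.fileSizeKey]))?.fileSize ?? 0
        print(url.lastPathComponent, size)
    }
}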
Hi,
I am modifying the sample camera app from https://developer.apple.com/tutorials/sample-apps/capturingphotos-camerapreview ... In processPreviewImages, I am using the Vision APIs to generate a segmentation mask for a person/object, then compositing that person onto a different background (with some other filtering). The filtering and compositing are done via Core Image. At the end, I convert the CIImage to a CGImage and then to a SwiftUI Image. When I run it on my iPhone, it works fine and has not crashed. When I run it on the iPhone with the debugger attached, it crashes within a few seconds with:
EXC_BAD_ACCESS in libRPAC.dylib`std::__1::__hash_table<std::__1::__hash_value_type<long, qos_info_t>, std::__1::__unordered_map_hasher<long, std::__1::__hash_value_type<long, qos_info_t>, std::__1::hash, std::__1::equal_to, true>, std::__1::__unordered_map_equal<long, std::__1::__hash_value_type<long, qos_info_t>, std::__1::equal_to, std::__1::hash, true>, std::__1::allocator<std::__1::__hash_value_type<long, qos_info_t>>>::__emplace_unique_key_args<long, std::__1::piecewise_construct_t const&, std::__1::tuple<long const&>, std::__1::tuple<>>:
It had previously been working fine with the debugger, so I'm not sure what has changed. Is there a difference in how the Vision APIs are executed if the debugger is attached vs. not?
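For context, the pipeline described is roughly the following (a sketch with an assumed person-segmentation request, not the poster's actual code):

import Vision
import CoreImage
import CoreImage.CIFilterBuiltins

// Segment the person in the frame, then blend them over a new background.
func composite(personFrom buffer: CVPixelBuffer,
               over background: CIImage,
               context: CIContext) throws -> CGImage? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced
    try VNImageRequestHandler(cvPixelBuffer: buffer, options: [:]).perform([request])
    guard let mask = request.results?.first?.pixelBuffer else { return nil }

    let original = CIImage(cvPixelBuffer: buffer)
    var maskImage = CIImage(cvPixelBuffer: mask)
    // Scale the lower-resolution mask up to the frame size.
    maskImage = maskImage.transformed(by: CGAffineTransform(
        scaleX: original.extent.width / maskImage.extent.width,
        y: original.extent.height / maskImage.extent.height))

    let blend = CIFilter.blendWithMask()
    blend.inputImage = original
    blend.backgroundImage = background
    blend.maskImage = maskImage
    guard let output = blend.outputImage else { return nil }
    return context.createCGImage(output, from: original.extent)
}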
I have recently been having trouble with my iOS 18.2 beta update. It has been two weeks since I updated to the iOS 18.2 beta and joined the Genmoji and Image Playground waitlist. I am wondering how much longer I have to wait until my request is approved.
I got a new iPhone on Boxing Day. Everything works except Image Playground; I've uninstalled and reinstalled it and turned Apple Intelligence on and off, but it's still stuck.
Topic: Machine Learning & AI / SubTopic: Apple Intelligence
We are developing Apple Intelligence features for foreign markets and adapting them for iPhone 17 models and above.
When the system language and the Siri language are not the same (for example, the system in English and Siri in Chinese), Apple Intelligence can be unusable. May I ask whether there are any other reasons that could cause Apple Intelligence to be unavailable within the app, even after it has been enabled?
Apple's Image Playground primarily performs image generation on-device, but can use secure Private Cloud Compute (PCC) for more complex requests that require larger models.
For tasks that require greater computational power than the device can provide, Image Playground leverages PCC, which extends the privacy and security of the device to the cloud:
Secure Environment: PCC runs on Apple silicon servers and uses a secure enclave to protect data, ensuring requests are processed in a verified, secure environment.
No Data Storage: Data is never stored or made accessible to Apple when using PCC; it is used only to fulfill the specific request.
Independent Verification: Independent experts are able to inspect the code running on these servers to verify Apple's privacy promises.
It's been 4-5 days and my Image Playground is still showing "Downloading Support for Image Playground".
Topic: Machine Learning & AI / SubTopic: Apple Intelligence
Hello,
Are there any plans to compile a python 3.13 version of tensorflow-metal?
I just got my new Mac mini, and the version of Python automatically installed by brew is Python 3.13. If I were in a hurry, I could manage to get Python 3.12 installed and use the corresponding tensorflow-metal version, but I'm not in a hurry.
Many thanks,
Alan
Hi, guys. I'm writing about Apple Intelligence, and I've reached the point where I have to explain App Intent Domains:
https://developer.apple.com/documentation/AppIntents/app-intent-domains
but I noticed that there is a note explaining that these services are not available with Siri. I tried the example provided by Apple at
https://developer.apple.com/documentation/AppIntents/making-your-app-s-functionality-available-to-siri
and I can only make the intents work from the Shortcuts App, but not from Siri.
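For reference, the linked example is built on the assistant schema macros, roughly like this trimmed sketch (the names follow Apple's photos-domain examples; this is a sketch, not the full sample code):

import AppIntents

@AssistantIntent(schema: .photos.openAsset)
struct OpenAssetIntent: OpenIntent {
    var target: AssetEntity

    func perform() async throws -> some IntentResult {
        // Navigate to the asset in the app.
        return .result()
    }
}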
Is this correct? Are App Intent Domains still not available with Siri?
Thanks
Topic: Machine Learning & AI / SubTopic: Apple Intelligence
My app uses App Intents. When the user says "Prüfung der Bluetooth Funktion" ("check the Bluetooth function"), the screen shows the whole phrase, but my app only receives "Bluetooth Funktion". This behavior only happens in the German version; in the English version everything works well.
Can anyone support me? Why does the German Siri cut off my words?
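For context, the shape of the intent in question is presumably something like this (all names here are hypothetical):

import AppIntents

struct CheckFunctionIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Function"

    // The free-form string captured from the Siri utterance; per the
    // report, only part of the spoken phrase arrives here in German.
    @Parameter(title: "Function")
    var phrase: String

    func perform() async throws -> some IntentResult {
        return .result()
    }
}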