Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Creating RTP-MIDI Sessions via MIDINetworkSession C API (dlopen/dlsym) on macOS 15?
I’m an amateur developer working on a free utility for composers/producers. The macOS release needs to create and name RTP-MIDI sessions in Audio MIDI Setup from the command line, so I can ship a small C helper instead of telling users to click through the UI.

Here’s what I’ve tried so far, without luck:

• Plist hacks: Injecting entries into ~/Library/Audio/MIDI Configurations/*.mcfg works when AMS is closed, but AMS immediately locks and reverts my changes when it’s open.
• CoreMIDI C API: I can create virtual ports with MIDISourceCreate, but attempting MIDIObjectGetDataProperty on the apple.midirtp.session plugin always returns err –10836.
• Obj-C & Swift: Loading MIDINetworkSession and calling defaultSession, init, setNetworkName: and setting enabled = YES doesn’t produce a new session object in the Network panel (minimal snippet below).
• dlopen/dlsym: I extracted the real CoreMIDI binary out of the dyld shared cache and tried binding _MIDINetworkSessionCreate, _SetName, _SetEnabled, etc., but all the symbols come back null or my tool segfaults.
• Plugin registration: I’ve pulled the factory UUID (70C9C5EA-7C65-11D8-B317-000393A34B5A) from /System/Library/Extensions/AppleMIDIRTPDriver.plugin/Contents/Info.plist and called CFPlugInRegisterFactories, but it still never exposes the session-creation calls.

At this point I’m convinced I’m either loading the wrong binary or missing one critical step in registering the RTP-MIDI plugin’s private API. Can anyone point me to:

• The exact path of the dylib or bundle that actually exports the MIDINetworkSessionCreate/MIDINetworkSessionSetName/MIDINetworkSessionSetEnabled symbols?
• A minimal working snippet (C or Obj-C) that reliably creates and names a Network-MIDI session?

Any pointers, sample code, or even ideas about where Apple hides this functionality on macOS 15 would be hugely appreciated. Thanks!
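For reference, the public-API snippet from the Obj-C/Swift attempt above (Swift for brevity). As far as I can tell it only ever touches the single built-in default session and offers no way to create or rename additional sessions, which is exactly the gap I’m trying to close:

```swift
import CoreMIDI

// Public-API sketch (my understanding only): enable the built-in network
// session and accept connections from anyone. Nothing here creates a second
// session or sets its name.
let session = MIDINetworkSession.default()
session.isEnabled = true
session.connectionPolicy = .anyone
print("enabled:", session.isEnabled, "name:", session.networkName)
```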
Replies: 0 · Boosts: 1 · Views: 196 · Activity: Jun ’25

Seeking Expert Clarification on MusicKit Usage and Compliance for an App in Saudi Arabia
Hello Apple Developer Community, We are developing a music management platform for restaurants and cafes in Saudi Arabia. Our app enables businesses to schedule playlists and allows visitors to request songs via barcodes. Music playback is powered by Apple Music, and users must have their own Apple Music subscriptions to access the music. Our service charges a monthly subscription fee for these management features, not for music access itself.

Project Overview and MusicKit Role
Our app integrates MusicKit to leverage Apple Music’s catalog and playback capabilities. Users log in with their Apple Music accounts, ensuring they have an active subscription for music playback. Our platform’s value lies in its tools—playlist scheduling and song requests—which are built on top of MusicKit’s APIs. We offer these features exclusively in Saudi Arabia.

Legal Context in Saudi Arabia
In Saudi Arabia, to our understanding, no special licenses are required for playing music in commercial venues like restaurants and cafes. This means our clients can use Apple Music subscriptions for playback without additional performance rights licenses. While this aligns with local laws, we recognize that Apple’s global policies may impose stricter requirements, prompting our need for clarification.

Subscription Model and Monetization Concerns
We charge a monthly subscription fee for access to our app’s features (e.g., scheduling playlists and managing song requests). This fee is separate from the Apple Music subscription, which users must maintain for playback. However, Apple’s MusicKit terms state: "You agree not to require payment for or indirectly monetize access to the Apple Music service." We’re concerned whether our subscription model might be interpreted as indirectly monetizing Apple Music access, given its reliance on MusicKit for functionality.

Scheduling Feature and Synchronization Rights
Our app allows businesses to schedule playlists for general time slots (e.g., “play this playlist from 6 PM to 8 PM”). It does not support precise scheduling, such as playing a specific song at an exact moment (e.g., “play this song at 7:30 PM”). Apple’s guidelines mention that “deeper or more complex music integration” may require additional licenses, like synchronization rights. We’re unsure if our general scheduling feature crosses this threshold or remains within MusicKit’s standard usage.

Questions for Clarification
We’d greatly appreciate expert input on the following:

1. Monetization: Does our subscription fee for management features (scheduling and song requests) violate Apple’s policy against indirectly monetizing Apple Music access?
2. Local Context: Given that Saudi Arabia requires no additional licenses for commercial music playback, does this impact our compliance with Apple’s global terms?
3. Scheduling: Does our playlist scheduling for general time slots (not exact moments) fall within MusicKit’s permitted scope, or does it require further licensing?

Thank you in advance for any insights or guidance to ensure our app aligns with Apple’s policies!
Replies: 0 · Boosts: 0 · Views: 379 · Activity: Mar ’25

Unstable Playlist.Entry.id causes crashes when removing duplicates
When multiple identical songs are added to a playlist, Playlist.Entry.id uses a suffix-based identifier (e.g. songID_0, songID_1, etc.). Removing one entry causes the others to shift, changing their .id values. This leads to diffing errors and collection view crashes in SwiftUI or UIKit when entries are updated.

Steps to Reproduce:

1. Add the same song to a playlist multiple times.
2. Observe the .id.rawValue of the entries (e.g. i.SONGID_0, i.SONGID_1).
3. Remove one entry.
4. Fetch the playlist again - note that the other IDs have shifted.

FB18879062
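For reference, the fetch I use to observe this (a minimal sketch; it assumes the playlist comes from the user's library, e.g. via MusicLibraryRequest, and the helper name is mine):

```swift
import MusicKit

// Hypothetical repro helper: dump entry identifiers before and after removing
// a duplicate to watch the suffix-based IDs (i.SONGID_0, i.SONGID_1, …) shift.
func dumpEntryIDs(of playlist: Playlist) async throws {
    let detailed = try await playlist.with(.entries)   // load the playlist's entries
    if let entries = detailed.entries {
        for entry in entries {
            print(entry.id.rawValue)
        }
    }
}
```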
Replies: 0 · Boosts: 0 · Views: 548 · Activity: Jul ’25

MusicKit - Skipping Forwards or Backwards does not update
Hello everyone, I am working on an app that lets you review your own music using Apple Music. Currently I am running into an issue with skipping forwards and backwards outside of the app.

How it should work: When skipping forward or backwards on the lock or home screen of an iPhone, the next or previous song on the album should play and the information should change to reflect that in the app. If you play a song in Apple Music, you can see a Now Playing view on the lock screen; when you skip forward or backwards, it performs the action and reflects it with the little frequency icon on the song's artwork.

What it's doing: When skipping forward or backwards on the lock or home screen, the next or previous song is reflected outside of the app, but not in the app. Skipping a song outside of the app correctly moves to the next song, but when I return to the app, the change is not reflected.

NOTE: I am not using MusicKit types such as Track or Album to display the songs. Since I want to grab the songs and review them, I need a rating, so I created my own model that grabs the MusicItemID, name, artist(s), etc.
NOTE: I am using ApplicationMusicPlayer.shared

Is there a way to get the song change to reflect in my app? (If it's easier, a simple example would be nice - no need to create an entire Xcode project.)
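In case it helps frame the question, this is the direction I've been experimenting with: observing the shared player's queue from SwiftUI. It assumes ApplicationMusicPlayer.Queue publishes its changes when a skip happens from the lock screen, and that assumption may be exactly what's wrong here:

```swift
import MusicKit
import SwiftUI

// Sketch: a view that should re-render whenever the shared player's current
// entry changes. `NowPlayingLabel` is just an illustrative name.
struct NowPlayingLabel: View {
    @ObservedObject private var queue = ApplicationMusicPlayer.shared.queue

    var body: some View {
        Text(queue.currentEntry?.title ?? "Nothing playing")
    }
}
```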
Replies: 0 · Boosts: 0 · Views: 99 · Activity: Apr ’25

VTDecompressionSession consistently drops frames after sync frame
I'm seeking to a specific sync frame in a video file (HEVC, recorded on iPad). When I feed the buffers from that sync frame onwards to VTDecompressionSession, it consistently drops the 2nd, 3rd, and 4th buffers with a kVTVideoDecoderReferenceMissingErr (or no error, but no buffer, on the simulator). If I feed all the buffers from the penultimate sync frame prior to the desired frame, the buffers come out fine, but doing that every time would create massive overhead. I've tried multiple OS versions, devices, etc.; it seems to be a consistent problem.

Here's a sample project with the offending video (disregard memory handling etc.): https://github.com/marcuseckert/vtSample

I've filed a radar, FB18228296, but would appreciate any feedback on circumventing or at least detecting this behavior prior to decoding.
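For context, this is roughly how I'm submitting and checking each buffer (a simplified sketch; it assumes the session was created without an output callback so the output-handler variant can be used):

```swift
import CoreMedia
import VideoToolbox

// Decode one sample buffer and log the reference-missing drops I'm seeing on
// the 2nd-4th buffers after the seek.
func decode(_ sampleBuffer: CMSampleBuffer, with session: VTDecompressionSession) {
    var infoFlags = VTDecodeInfoFlags()
    let status = VTDecompressionSessionDecodeFrame(
        session,
        sampleBuffer: sampleBuffer,
        flags: [._EnableAsynchronousDecompression],
        infoFlagsOut: &infoFlags
    ) { decodeStatus, _, imageBuffer, presentationTime, _ in
        if decodeStatus == kVTVideoDecoderReferenceMissingErr || imageBuffer == nil {
            print("dropped frame at \(presentationTime.seconds): \(decodeStatus)")
        }
    }
    assert(status == noErr, "decode submission failed: \(status)")
}
```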
Replies: 0 · Boosts: 0 · Views: 135 · Activity: Jun ’25

About the built-in instrument sound of Apple devices
Does anyone know how to play the sound of a specific instrument when a button is tapped on the screen of an iPhone or iPad? I'm in the middle of creating a music-learning app, and I'm thinking of assigning single notes or chords to the button-like frames on the on-screen keyboard and fingerboard. Can this be achieved with SwiftUI alone? I remember that MIDI Level 1 had built-in instrument sounds, but I don't know how to implement the same functionality on the current OS. A rough sketch of what I'm imagining follows below - please lend me your wisdom.
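Specifically, something like this (using AVAudioEngine and AVAudioUnitSampler; I'm not sure whether a General MIDI sound bank ships with the OS, so instrument loading is left out):

```swift
import AVFoundation

// Sketch: an engine with a sampler node; a SwiftUI button action would call
// playNote(60) for middle C, or several notes together for a chord.
let engine = AVAudioEngine()
let sampler = AVAudioUnitSampler()

func startAudio() throws {
    engine.attach(sampler)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)
    try engine.start()
}

func playNote(_ note: UInt8) {
    sampler.startNote(note, withVelocity: 80, onChannel: 0)
}

func stopNote(_ note: UInt8) {
    sampler.stopNote(note, onChannel: 0)
}
```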
Replies: 0 · Boosts: 0 · Views: 60 · Activity: May ’25

How to override the default USB video
According to the doc, I did a simple demo to verify.

My env:
ProductName: macOS
ProductVersion: 15.5
BuildVersion: 24F74
2.4 GHz quad-core Intel Core i5

Info.plist:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>IOKitPersonalities</key>
    <dict>
        <key>UVCamera</key>
        <dict>
            <key>CFBundleIdentifierKernel</key>
            <string>com.apple.kpi.iokit</string>
            <key>IOClass</key>
            <string>IOUserService</string>
            <key>IOMatchCategory</key>
            <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
            <key>IOProviderClass</key>
            <string>IOUserResources</string>
            <key>IOResourceMatch</key>
            <string>IOKit</string>
            <key>IOUserClass</key>
            <string>UVCamera</string>
            <key>IOUserServerName</key>
            <string>$(PRODUCT_BUNDLE_IDENTIFIER)</string>
            <key>IOProbeScore</key>
            <integer>100000</integer>
            <key>idVendor</key>
            <integer>1452</integer>
            <key>idProduct</key>
            <integer>34068</integer>
        </dict>
    </dict>
    <key>OSBundleUsageDescription</key>
    <string></string>
</dict>
</plist>

UVCamera.cpp:

//
//  UVCamera.cpp
//  UVCamera
//
//  Created by DTEN on 2025/6/12.
//

#include <os/log.h>
#include <DriverKit/IOUserServer.h>
#include <DriverKit/IOLib.h>
#include "UVCamera.h"

kern_return_t IMPL(UVCamera, Start)
{
    kern_return_t ret;
    ret = Start(provider, SUPERDISPATCH);
    os_log(OS_LOG_DEFAULT, "Hello World");
    return ret;
}

UVCamera.iig:

//
//  UVCamera.iig
//  UVCamera
//
//  Created by DTEN on 2025/6/12.
//

#ifndef UVCamera_h
#define UVCamera_h

#include <Availability.h>
#include <DriverKit/IOService.iig>

class UVCamera: public IOService
{
public:
    virtual kern_return_t Start(IOService * provider) override;
};

#endif /* UVCamera_h */

Then I build it with Xcode and move it to /Library/DriverExtensions:

sudo mv com.lqs.MyVirtualCam.UVCamera.dext /Library/DriverExtensions
sudo kmutil install -R / -r /Library/DriverExtensions
kmutil rebuild done

However, the dext can't be loaded:

kmutil showloaded --list-only | grep UVCamera
No variant specified, falling back to release

What's the problem? Can anyone help me?
Replies: 0 · Boosts: 0 · Views: 207 · Activity: Jun ’25

Playing FairPlay-encrypted content works fine on iOS 17 but won't play on iOS 26
For devices that are still on iOS 17, playing FairPlay-encrypted content still works fine. For devices that I've upgraded to iOS 26, playing the same content in the same app no longer works. I can advance and see the stream frames by tapping +10 / scrubbing, so I know that the content is being decrypted, but tapping the play button of AVPlayer for an AVPlayerItem now does nothing on iOS 26. Is this a breaking change, or is there a stricter requirement that I now have to implement?
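For what it's worth, this is the state I'm dumping when the play button does nothing on iOS 26 (a small diagnostic sketch; the function name is mine):

```swift
import AVFoundation

// Log why AVPlayer isn't starting: item status/error plus the player's
// time-control state and any waiting reason.
func logPlayerState(_ player: AVPlayer) {
    print("timeControlStatus:", player.timeControlStatus.rawValue)
    print("waiting reason:", player.reasonForWaitingToPlay?.rawValue ?? "none")
    print("item status:", player.currentItem?.status.rawValue ?? -1)
    print("item error:", String(describing: player.currentItem?.error))
}
```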
Replies: 0 · Boosts: 0 · Views: 190 · Activity: Oct ’25

How to match music with ShazamKit for Android?
Hi all, I can successfully match music using ShazamKit on Apple platforms with SwiftUI: a simple app that lets the user load an audio file and extracts the corresponding match. However, I am unable to match music using ShazamKit on Android. I am trying to make the same simple app, but I cannot match music, as I get MATCH_ATTEMPT_FAILED every time I try. I don't know what I am doing wrong; the ShazamKit part of the Kotlin Android code is in this method:

suspend fun processAudioFileInBackground(
    filePath: String,
    developerTokenProvider: DeveloperTokenProvider
) = withContext(Dispatchers.IO) {
    val bufferSize = 1024 * 1024
    val audioFile = FileInputStream(filePath)
    val byteBuffer = ByteBuffer.allocate(bufferSize)
    byteBuffer.order(ByteOrder.LITTLE_ENDIAN)
    var bytesRead: Int
    while (audioFile.read(byteBuffer.array()).also { bytesRead = it } != -1) {
        val signatureGenerator = (ShazamKit.createSignatureGenerator(AudioSampleRateInHz.SAMPLE_RATE_44100) as ShazamKitResult.Success).data
        signatureGenerator.append(byteBuffer.array(), bytesRead, System.currentTimeMillis())
        val signature = signatureGenerator.generateSignature()
        println("Signature: ${signature.durationInMs}")
        val catalog = ShazamKit.createShazamCatalog(developerTokenProvider, Locale.ENGLISH)
        val session = (ShazamKit.createSession(catalog) as ShazamKitResult.Success).data
        val matchResult = session.match(signature)
        println("MatchResult : $matchResult")
        setMatchResult(matchResult)
        byteBuffer.clear()
    }
    audioFile.close()
}

I noticed that changing the Locale in catalog creation gives a different result: I get NoMatch without an exception. Can you please help me with this?
Replies: 0 · Boosts: 0 · Views: 92 · Activity: Apr ’25

How to record voice, auto-transcribe, translate (auto-detect input language), and play back translated audio on same device in iOS Swift?
Hi everyone 👋 I'm building an iOS app in Swift where I want to do the following:

• Record the user's voice
• Transcribe the spoken sentence (speech-to-text)
• Auto-detect the spoken language
• Translate it to another language selected by the user (e.g., English → Spanish or Hindi → English)
• Speak back (text-to-speech) the translated text on the same device

Is it possible to record via the phone's mic and play the synthesized, translated voice back through the headphones' audio?
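For context, these are the pieces I've identified so far on the iOS side (a rough sketch; the translation step is a placeholder since I'm not sure which API to use for it, and the function names are mine):

```swift
import Speech
import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Speech-to-text from a recorded file (requires speech-recognition permission).
func transcribe(fileURL: URL, completion: @escaping (String?) -> Void) {
    let recognizer = SFSpeechRecognizer()   // uses the device locale by default
    let request = SFSpeechURLRecognitionRequest(url: fileURL)
    _ = recognizer?.recognitionTask(with: request) { result, error in
        if let result, result.isFinal {
            completion(result.bestTranscription.formattedString)
        } else if error != nil {
            completion(nil)
        }
    }
}

// Text-to-speech for the translated text, e.g. language "es-ES".
func speak(_ text: String, language: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: language)
    synthesizer.speak(utterance)
}
```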
Replies: 0 · Boosts: 0 · Views: 275 · Activity: Oct ’25

Creating an initial Now Playing state of paused - impossible?
I am working on an app which plays audio - https://youtu.be/VbAfUk_eYl0?si=nJg5ayy2faWE78-g - and one of its features is that, on restart, if you had paused playback of a file when the app was previously shut down (or were playing one at the time of shutdown), the paused state and position in the file are restored exactly as they were.

The functionality works. However, it seems impossible to get the "now playing" information in iOS into the right state to reflect that via the MediaPlayer API. On restart, handlers are attached to the play/pause/togglePlayPause actions on MPRemoteCommandCenter.shared(), and the map of media info is updated on MPNowPlayingInfoCenter.default().nowPlayingInfo. What happens is that iOS's media view shows the audio as playing and offers a pause button - even though the play action is enabled and the pause action is disabled. Once playback has actually been initiated, the controls behave as expected (my workaround is to have the pause action toggle the play state, since otherwise you wouldn't be able to initiate playback from controls in a car without initiating it once from a device first).

I've created a simplified white-noise-player demo to illustrate the problem - simply build and deploy it, then start the app, lock your device and look at the playback controls on the lock screen. It will show a pause button - the same behavior I've described. https://github.com/timboudreau/ios-play-pause-demo

I've tried a few things to narrow down the source of the issue - for example, thinking that not setting MPNowPlayingInfoPropertyPlaybackProgress and MPMediaItemPropertyPlaybackDuration on startup might do the trick (since the system interpolates elapsed time and it's recommended to update those properties infrequently) - but the result is the same, just without a duration or progress shown.

What governs this behavior, and is there some way to explicitly tell the media player API your current state is paused?
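For reference, this is essentially the info I publish on restart (a trimmed sketch; my understanding is that the system infers play/pause from MPNowPlayingInfoPropertyPlaybackRate, so I set it to 0, yet the lock screen still shows a pause button):

```swift
import MediaPlayer

// Publish a "paused at this position" state for the lock screen / Control Center.
func publishPausedState(title: String, position: TimeInterval, duration: TimeInterval) {
    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyPlaybackDuration: duration,
        MPNowPlayingInfoPropertyElapsedPlaybackTime: position,
        MPNowPlayingInfoPropertyPlaybackRate: 0.0   // 0 = paused, 1 = playing
    ]
}
```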
Replies: 0 · Boosts: 2 · Views: 163 · Activity: Apr ’25

The files generated using AVAudioRecorder have a constant size of only 4 KB
Hello. My app uses AVAudioRecorder to generate recording files, which for some users are consistently only 4 KB in size. Most users generate audio files normally; only a few users experience this occasionally. After uninstalling and reinstalling the app it works normally for a while, but the problem reappears after some time. I have compared the problematic audio files: they are identical each time and cannot be played. I added the audioRecorderDidFinishRecording delegate method, which reports that the recording completed normally. The user also reports that recording appears normal, but there is a problem with the generated file. How should I handle this issue? I look forward to your reply.

- (void)startRecordWithOrderID:(NSString *)orderID {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryRecord error:nil];
    [audioSession setActive:YES error:nil];

    NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];
    [settings setObject:[NSNumber numberWithFloat:8000.0] forKey:AVSampleRateKey];
    [settings setObject:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    [settings setObject:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [settings setObject:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
    [settings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    [settings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    NSString *path = [WDUtility createDirInDocument:@"audios" withOrderID:orderID withPathExtension:@"wav"];
    NSURL *tmpFile = [NSURL fileURLWithPath:path];

    recorder = [[AVAudioRecorder alloc] initWithURL:tmpFile settings:settings error:nil];
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    [recorder record];
}
Replies: 0 · Boosts: 0 · Views: 229 · Activity: Jul ’25

How to enumerate built-in media devices?
Hello, I need to enumerate built-in media devices (cameras, microphones, etc.). For this purpose, I am using the CoreAudio and CoreMediaIO frameworks. According to the table 'Daemon-Safe Frameworks' in Apple’s TN2083, CoreAudio is daemon-safe. However, the documentation does not mention CoreMediaIO. Can CoreMediaIO be used in a daemon? If not, are there any documented alternatives to detect built-in cameras in a daemon (e.g., via device classes in IOKit)? Thank you in advance, Pavel
Replies: 0 · Boosts: 0 · Views: 159 · Activity: May ’25

Disabling Hardware OIS via AVFoundation — Clarification on AVCaptureVideoStabilizationMode
Hello everyone, I'm looking for a definitive clarification on how to completely disable all video stabilization, including the hardware OIS, using AVFoundation. The goal is to achieve a completely raw, unstabilized video feed, which is crucial when using external equipment like gimbals to avoid conflicting stabilization motions.

My research points to using the AVCaptureConnection property preferredVideoStabilizationMode and setting it to AVCaptureVideoStabilizationMode.off. The documentation for the .off case states: "A mode that doesn’t stabilize video capture." This description is slightly ambiguous. It's unclear whether this only affects software-level stabilization (EIS, EIS+OIS, etc.) or if it guarantees the complete deactivation of the physical OIS module. For professional video applications, this is a critical distinction.

So, I'd like to ask the community:

• Has anyone been able to definitively confirm that setting preferredVideoStabilizationMode to .off also disables the hardware OIS?
• Are there any known tests or documentation that prove this behavior?
• Is there an alternative or more direct method to ensure the OIS module is physically inactive during video capture?
• What is the community's best practice for ensuring absolutely no stabilization is applied to the video pipeline?

Any insights or shared experiences on this topic would be greatly appreciated. Thank you!
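For completeness, this is how I'm currently applying the setting (a short sketch showing the API in question; it obviously can't answer whether the OIS hardware is truly idle):

```swift
import AVFoundation

// Request no stabilization on the video connection, where the connection
// reports that stabilization is supported at all.
func disableStabilization(on output: AVCaptureVideoDataOutput) {
    guard let connection = output.connection(with: .video),
          connection.isVideoStabilizationSupported else { return }
    connection.preferredVideoStabilizationMode = .off
}
```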
Replies: 0 · Boosts: 1 · Views: 303 · Activity: Sep ’25

Play Audio and Recognize Speech in Car
Hello, I'm trying to determine the best/recommended AVAudioSession configuration (i.e. category, mode, and options) for the following use case. Essentially, I'd like to switch between periods of playing an audio file and then recognizing speech. The audio file is typically speech, and I don't intend for playback and speech recognition to occur simultaneously. I'd also like the user to still be able to interact with Siri, and I'd like it to work with CarPlay, where navigation prompts can occur. I would assume the category to use is 'playAndRecord', but I'm not sure whether it's better to set that once for the entire lifecycle, or to set 'playback' for audio file playback and then switch to 'playAndRecord' for speech recognition. I'm also not sure of the best 'mode' and 'options' to set. Any suggestions would be appreciated - a sketch of the configuration I'm currently leaning toward is below. Thanks.
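The sketch (only a guess on my part; the mode and options are exactly what I'm unsure about):

```swift
import AVFoundation

// One session configuration kept for both playback and speech recognition.
func configureAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(
        .playAndRecord,
        mode: .spokenAudio,
        options: [.allowBluetooth, .defaultToSpeaker, .duckOthers]
    )
    try session.setActive(true)
}
```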
Replies: 0 · Boosts: 0 · Views: 591 · Activity: Sep ’25

401 Unauthorized when attempting to access Apple Music Feed API
Hello, I am trying to access the Apple Music Feed API, but I am receiving a 401 Unauthorized error whenever I try to access it. I have tried using my own code to generate a JWT and call the API directly (the same code can call the standard Apple Music API successfully):

> GET /v1/feed/song/latest HTTP/2
> Host: api.media.apple.com
> user-agent: insomnia/2023.5.8
> authorization: Bearer [REDACTED]
> accept: */*

< HTTP/2 401
< content-type: application/json; charset=utf-8
< content-length: 0
< x-apple-jingle-correlation-key: AV5IOHBNM2UUJVOFQ4HZ2TGF6Q
< x-daiquiri-instance: daiquiri:10001:daiquiri-all-shared-ext-7bb7c9b9bb-r459v:7987:25RELEASE91:daiquiri-amp-kubernetes-shared-ext-ak8s-prod-pv4-amp-daiquiri-ingress-prod

and also the Apple-provided Python example code, which gives me authentication errors too:

$ python3 ./apple_music_feed_example.py --key-id NMBH[...] --team-id 3TNZ[...] --secret-key-file-path "/Users/foxt/Documents/am-feed/NMBH[...].p8" --out-dir .
running....
INFO:__main__:Sending requests to https://api.media.apple.com
INFO:__main__:Getting the latest export for feed artist
Exception: Authentication Failed. Did you provide the correct team id, key id, and p8 file?

Does this API need to be enabled on my account separately from the main Apple Music API? The documentation reads to me as if anyone with an Apple Developer Programme membership can use this API, and I did not see any information regarding any other requirements.
Replies: 0 · Boosts: 0 · Views: 439 · Activity: Sep ’25

MusicKit developer token issue
I'm reaching out regarding a recurring issue I'm experiencing with MusicKit developer tokens. I'm using a valid .p8 private key to sign JWTs for Apple MusicKit integration. Each token I generate includes the appropriate claims (iss, iat, exp) and is signed with the ES256 algorithm, with an expiration date set approximately 6 months ahead. Everything works as expected immediately after generating the token. However, after a few days, the same JWT (still well within its expiration period) suddenly begins returning invalid/unauthorized responses when used in Postman and other API clients.

Importantly:

• I did not delete or revoke the .p8 key during this time.
• I verified the JWT contains valid claims and a proper structure.
• The issue consistently resolves only when I create a new .p8 file and regenerate a fresh JWT with it, after which the cycle repeats.

This issue occurs even when the environment and app identifiers remain unchanged. I would greatly appreciate it if you could help me understand:

1. Why these tokens become invalid after a few days, despite having a long exp value and an unchanged key.
2. Whether there's any automatic revocation or timeout policy on .p8 keys that could explain this behavior.
3. If there's a better way to maintain long-lived developer tokens without requiring new .p8 key generation every few days.

Thank you for your help and clarification on this issue.

Best regards,
Liad Altif
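P.S. For reference, this is roughly how the token is generated (a simplified CryptoKit sketch of my real code; identifiers are placeholders):

```swift
import Foundation
import CryptoKit

// Build an ES256-signed MusicKit developer token from a .p8 private key (PEM text).
func makeDeveloperToken(keyID: String, teamID: String, privateKeyPEM: String) throws -> String {
    func base64url(_ data: Data) -> String {
        data.base64EncodedString()
            .replacingOccurrences(of: "+", with: "-")
            .replacingOccurrences(of: "/", with: "_")
            .replacingOccurrences(of: "=", with: "")
    }
    let now = Int(Date().timeIntervalSince1970)
    let header = try JSONSerialization.data(withJSONObject: ["alg": "ES256", "kid": keyID])
    let claims = try JSONSerialization.data(
        withJSONObject: ["iss": teamID, "iat": now, "exp": now + 60 * 60 * 24 * 180] as [String: Any]
    )
    let signingInput = base64url(header) + "." + base64url(claims)
    let key = try P256.Signing.PrivateKey(pemRepresentation: privateKeyPEM)
    let signature = try key.signature(for: Data(signingInput.utf8))   // raw r||s, as JWT expects
    return signingInput + "." + base64url(signature.rawRepresentation)
}
```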
Replies: 0 · Boosts: 0 · Views: 149 · Activity: Jun ’25

When deleting photos, encountered PHPhotosError.operationInterrupted (3301).
Hi, I’ve developed a photo app that includes a photo deletion feature. Some users have reported encountering PHPhotosError.operationInterrupted (3301) when attempting to delete photos. Initially, I suspected that some of the assets might have a sourceType of typeiTunesSynced, since the documentation notes that iTunes-synced assets cannot be edited or deleted. However, after checking the logs, all of the assets involved are of typeUserLibrary. Additionally, the user mentioned that some photos in the iPhone Photos app do not show a delete button. I’m unsure whether the absence of the delete button is related to the 3301 error.

I’d like to confirm the following:

1. Under what conditions does PHPhotosError.operationInterrupted (3301) occur, and how should it be handled?
2. Why do some photos in the iPhone Photos app not show a delete button?

The code for deleting photos is as follows:

PHPhotoLibrary *library = [PHPhotoLibrary sharedPhotoLibrary];
[library performChanges:^{
    PHFetchResult *assetsToBeDeleted = [PHAsset fetchAssetsWithLocalIdentifiers:delUrls options:nil];
    if (assetsToBeDeleted) {
        [PHAssetChangeRequest deleteAssets:assetsToBeDeleted];
    }
} completionHandler:^(BOOL success, NSError *error) {
    // handle the result; error code 3301 (operationInterrupted) is the failure users hit
}];
Replies: 0 · Boosts: 0 · Views: 125 · Activity: Jun ’25