Explore best practices for creating inclusive apps that cater to users with diverse abilities

Posts under General subtopic

VoiceOver is not respecting lang in HTML option
I have an HTML select whose options contain Spanish text. When VoiceOver reads the selected option (with the select closed), it switches to the Spanish voice as expected. However, when you open the select box and browse through the options, it reads the Spanish text with the English voice. I have tried adding lang to both the select tag and the option tags, but neither helps: https://codepen.io/grahamfowles/pen/VYYRxMK
Replies: 0 · Boosts: 0 · Views: 141 · Activity: May ’25
Handling VoiceOver Focus When Screen Changes (Push, Present, and SplitViewController)
I have some doubts about how VoiceOver handles focus when the screen updates. When a new UIViewController is pushed onto a UINavigationController or presented modally, how does VoiceOver decide which element to focus on? Is there a way to control or customize this behavior? In a UISplitViewController, when an item is selected in the primary view controller, the focus should shift to the relevant content in the secondary view controller. How can we ensure that VoiceOver correctly moves focus to the right element in the secondary panel?
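A minimal sketch of the usual mechanism (the view controller and label names here are illustrative, not from the post): by default VoiceOver picks its own focus target after a push or present, but posting a .screenChanged notification with an argument asks it to focus that element instead.

import UIKit

final class DetailViewController: UIViewController {
    private let titleLabel = UILabel()

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Ask VoiceOver to move focus to the title once the push/present settles.
        UIAccessibility.post(notification: .screenChanged, argument: titleLabel)

        // For smaller in-place updates (e.g. the secondary column of a
        // UISplitViewController changing content), .layoutChanged with the
        // target element is the lighter-weight variant:
        // UIAccessibility.post(notification: .layoutChanged, argument: someView)
    }
}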
Replies: 0 · Boosts: 0 · Views: 163 · Activity: Apr ’25
Apple Pay Merchant Session - Could not create SSL/TLS secure channel issue
As part of our Apple Pay implementation, we are trying to create a merchant session by connecting to the Apple endpoint https://apple-pay-gateway-cert.apple.com/paymentservices/startSession. While doing so we get the error “An error occurred while sending the request. The request was aborted: Could not create SSL/TLS secure channel.” I call the validation URL from a C# .NET Framework 4.8 Web API. The API sets up an HttpClient with the Merchant Identity Validation Certificate from my Apple account and calls the validation URL, passing in the required JSON validation object. When I call PostAsync() I get the exception with the above error message. The code works successfully on my local machine, but we hit this issue when it is deployed to our Dev/Model environments for testing. We deployed to an Azure App Service, where TLS 1.2 is already enabled. We have used the Merchant Identity certificate that was issued, and we have also checked with the networking and infrastructure teams to make sure the problem is not on our side. Does anyone have any idea what could be causing this error? Thank you, Supriya
Replies: 0 · Boosts: 0 · Views: 120 · Activity: Jun ’25
Accessibility Personal Voice Issue with iOS & iPadOS Beta 1-6
Please refer to Feedback report FB19701007. A Personal Voice file created in English changes to either Spanish or Chinese and no longer works properly. This has been happening since beta 1 of iOS/iPadOS 26. I have been unable to pinpoint what causes it; possibly downloading foreign voices to a device, or using different voices via AVSpeechSynthesizer. I run an app in Xcode on my device that prints info about the installed voices to the console. Initially, after creation, here is the output:

Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: en-US

Then I hear a Spanish accent and find this:

Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: es-MX

Currently it isn't working at all, and here is the output:

Voice Identifier: com.apple.speech.personalvoice.16173F8D-DFB0-4024-98CC-69D965FD96A4
Language: zh-CN

Note that the voice identifier is the same in all three cases. No matter what the user does, a created voice file should never be able to change languages. I reset all of my test devices by erasing all content and settings and creating a new English Personal Voice, and the issue persists. A side issue: the share-across-devices toggle doesn't stay off when turned off. I tried not sharing to see whether that was the cause, but the toggle turns itself back on.
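For anyone who wants to reproduce the diagnostic output above, a minimal sketch of the voice-listing app (assuming Personal Voice has already been created and authorization is granted) might look like this:

import AVFoundation

// Request permission to use Personal Voice, then print the identifier
// and language tag of every installed personal voice.
AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
    guard status == .authorized else { return }
    let personalVoices = AVSpeechSynthesisVoice.speechVoices()
        .filter { $0.voiceTraits.contains(.isPersonalVoice) }
    for voice in personalVoices {
        print("Voice Identifier: \(voice.identifier)")
        print("Language: \(voice.language)")
    }
}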
Replies: 0 · Boosts: 0 · Views: 868 · Activity: Aug ’25
Icon labels missing
Since the last beta upgrade for iPad to 26.3, icon labels have disappeared. Going into Settings > Accessibility, the toggle setting for labels makes no difference whether it is on or off; labels are permanently missing.
Replies: 1 · Boosts: 0 · Views: 1.1k · Activity: Jan ’26
Thai input glitch after Command + Delete in macOS beta
Issue: When using the shortcut Command + Delete to clear a line of text, the next character I type in Thai unexpectedly appears as an English character, even though the input source is still set to Thai. After that, subsequent characters return to Thai as expected.

Details:
- Affected apps: Notes, Messages, and some other native apps
- Not affected: browser text fields (Safari, Chrome, etc.)
- Does not occur when using Option + Delete or just Delete
- macOS: [insert beta version + build number]
- Mac model: [insert model]
- Input sources: Thai – Kedmanee, English – U.S.

Steps to reproduce:
1. Open Notes (or Messages).
2. Switch to Thai input.
3. Type a few Thai words.
4. Press Command + Delete.
5. Type again: the first character shows up in English.

Expected: The first character should remain in Thai, consistent with the active input source.
Actual: The first character shows as English, then input switches back to Thai.
Replies: 0 · Boosts: 0 · Views: 816 · Activity: Aug ’25
VoiceOver mode issues
I added a view controller in the storyboard, added a table view to it, and added a cell to the table. When I run the app and navigate to that page with VoiceOver on, swiping up or down with three fingers announces a sentence in English. Without changing the accessibility of the cell itself, how can I make the announcement that accompanies the three-finger swipe be spoken in Chinese?
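One possible direction, offered as an assumption rather than a confirmed answer: the three-finger-swipe sentence is the page-scroll status, and you may be able to supply your own by handling the scroll yourself and posting a .pageScrolled notification whose argument is an attributed string with a forced speech language. Whether the override below is consulted for a table view that already scrolls natively is something to verify; the page text is hypothetical.

import UIKit

final class TableViewController: UITableViewController {
    // Invoked when VoiceOver performs a three-finger scroll on this screen.
    // Returning true claims the gesture, letting us post a localized status.
    override func accessibilityScroll(_ direction: UIAccessibilityScrollDirection) -> Bool {
        let pageHeight = tableView.bounds.height
        let delta = (direction == .up) ? pageHeight : -pageHeight
        tableView.setContentOffset(
            CGPoint(x: 0, y: max(0, tableView.contentOffset.y + delta)),
            animated: true)

        // Force a Chinese voice via the accessibilitySpeechLanguage attribute.
        let status = NSAttributedString(
            string: "第 1 页，共 3 页", // hypothetical "page 1 of 3" text
            attributes: [.accessibilitySpeechLanguage: "zh-CN"])
        UIAccessibility.post(notification: .pageScrolled, argument: status)
        return true
    }
}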
Replies: 1 · Boosts: 0 · Views: 77 · Activity: Jun ’25
A Summary of the WWDC25 Group Lab - Accessibility
At WWDC25 we launched a new type of lab event for the developer community: Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers, and a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Accessibility.

Accessibility Nutrition Labels are a really big step forward for the experience people have on the App Store to find apps that will work for them. How should developers get started with Accessibility Nutrition Labels?
A good starting point is to review the Accessibility Nutrition Label evaluation criteria on App Store Connect Help. It's a concise document, roughly 10 pages, and you can approach it section by section after the introduction. Even with prior experience using accessibility features like VoiceOver, the criteria offer valuable insights that might not be immediately apparent. For those newer to accessibility, a good entry point might be one of the visual feature labels, such as Dark Interface, which is a popular and frequently used feature.

Which accessibility features can I indicate support for in Accessibility Nutrition Labels?
The accessibility features covered include support for assistive technologies like VoiceOver and Voice Control, media enhancements such as captions and audio descriptions, and display accommodations. The display accommodations cover options like larger text, dark interface, differentiating without color alone, sufficient contrast, and reduced motion.

With the new Accessibility Nutrition Labels, will App Store reviewers validate what we select?
The Accessibility Nutrition Label can be edited at any time without requiring a new app submission. However, if an app inaccurately claims feature support, App Review may contact the developer and request an update to the label or the app.

Are there any updates to tools for analyzing the accessibility of our apps?
Although there aren't new updates this year, continued support for accessibility audits is available through Xcode's built-in Accessibility Inspector. XCTest also supports accessibility audits, enabling developers to test app accessibility with every build (see the sketch after this summary). These audits analyze aspects like contrast, Dynamic Type, text clipping, element labels, and more within each view. For a deeper dive, the "Perform accessibility audits for your app" session from WWDC 2023 is a valuable resource.

What are accessibility features you wish more people integrated?
User input labels optimized for Voice Control, keyboard navigation and shortcuts, and Dynamic Type support could all be adopted more widely to benefit users.

What were some of the biggest accessibility challenges your team encountered while developing Liquid Glass?
Apple is known for its innovation and strives to deliver a high-quality experience for everyone. Accessibility is considered a core component of visual design from the outset. For example, the Liquid Glass design inherently supports reduced transparency and increased contrast. As the design continues to evolve, user feedback submitted through Feedback Assistant is invaluable.

How does Liquid Glass respond to contrast, especially for text in low-contrast environments?
Content legibility is a crucial aspect of the Liquid Glass design. It inherently supports accessibility features like reduced transparency and increased contrast. Your feedback during the beta period and beyond is essential to ensuring Liquid Glass provides a great experience within your apps.

What are some Apple apps that stand out for their accessibility?
Apps like Keynote in the iWork suite offer groundbreaking VoiceOver features to enhance creative productivity for all users. Assistive Access makes core apps such as Messages, Photos, Camera, Phone, and Music more accessible. Podcasts provides transcripts to broaden its reach, and frameworks like SwiftUI ensure that apps built with the latest UI frameworks have excellent built-in accessibility.
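As a companion to the audit answer above, here is a minimal sketch of an XCTest UI test that runs the built-in accessibility audit (Xcode 15 or later; the test and class names are illustrative):

import XCTest

final class AccessibilityAuditTests: XCTestCase {
    // Audits whatever screen the app lands on at launch, flagging issues
    // such as clipped text, low contrast, and missing element labels.
    func testLaunchScreenPassesAccessibilityAudit() throws {
        let app = XCUIApplication()
        app.launch()
        try app.performAccessibilityAudit()
    }
}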
Replies: 0 · Boosts: 0 · Views: 922 · Activity: Jul ’25
What is the appropriate accessibility trait for selectable UITableViewCell?
I’m trying to understand the best practice for assigning accessibilityTraits to a UITableViewCell that users can select from a list of options. In Apple’s first-party apps like Settings, I’ve noticed an inconsistent approach—some cells use the Button trait, while others simply announce the label along with the Selected trait when applicable, without any additional role like Button or Adjustable. So my question is: What is the most appropriate accessibility trait to use for a selectable table view cell that updates a selection (like a settings option)? Is using .button the right approach, or should we rely solely on .selected? Is there any user experience guideline from Apple that recommends one over the other? Would love to hear how others handle this for clarity and consistency in VoiceOver behavior.
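For concreteness, a minimal sketch of the selection-trait approach the poster describes (one option among those discussed, not an Apple-endorsed answer; the cell class and method are illustrative):

import UIKit

final class OptionCell: UITableViewCell {
    // Reflect the current selection in the accessibility tree: with the
    // .selected trait present, VoiceOver prepends "Selected" to the label.
    func configure(title: String, isChosen: Bool) {
        var content = defaultContentConfiguration()
        content.text = title
        contentConfiguration = content
        accessibilityTraits = isChosen ? .selected : []
        // Alternative approach from the post: also expose the cell as a
        // button, hinting that activating it changes the selection.
        // accessibilityTraits = isChosen ? [.button, .selected] : .button
    }
}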
Replies: 1 · Boosts: 0 · Views: 158 · Activity: Apr ’25
False 3.1.1 Rejection: Real-World Dues Payments App
Hello everyone, our community dues payment app only facilitates real-world maintenance-dues payments made directly to property managers’ bank accounts. However, during review it was likely flagged by the AI-driven review system on a metadata criterion and rejected under Guideline 3.1.1 (“Paid digital content must use IAP”). Meanwhile, hundreds of similar apps remain live on the App Store using the exact same model:
- The app is completely free
- No digital content or subscriptions are sold
- Dues payments are made via bank transfer or credit card directly to the manager

Has anyone else encountered this? How did you overcome the metadata check in the AI-driven review process? Thanks!
Replies: 0 · Boosts: 0 · Views: 119 · Activity: May ’25
FamilyControls API access
I’m requesting access to the Family Controls API for an iOS app currently in development. I’ve submitted the request through the official form here: https://developer.apple.com/contact/request/family-controls-distribution However, after submitting, I receive no confirmation email or support ticket ID. The page only shows a “Thank you for requesting the API” message, and I’m left without a way to track or confirm the request. This entitlement is essential for my app’s functionality, and I need to move forward with development and testing. Can someone from the Apple team please confirm receipt of the request and provide guidance on the next steps or estimated timelines?
Replies: 0 · Boosts: 0 · Views: 374 · Activity: May ’25
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), both of which are critical in Sign Language grammar. I'd like to propose and discuss an architecture that leverages the LiDAR and Neural Engine capabilities of current iPhones to solve this.

The concept: skeleton-based normalization. Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's body tracking to abstract the input (see the sketch below):
- Capture: Use ARKit/LiDAR to track the user's upper-body and hand joints in 3D space.
- Data normalization: Extract only the vector coordinates (X, Y, Z of each joint). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
- Comparison: Feed these vectors into a Core ML model trained on "reference skeletons" (recorded by native signers).
- Feedback loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific corrections (e.g., "Raise your elbow 10 degrees").

Why this approach?
- Solves occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
- Privacy: We are processing coordinates, not video streams.
- Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.

Has anyone experimented with using ARKit body anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "skeleton first" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
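A minimal sketch of the capture-and-compare step described in the list above (the joint set, reference store, and 5 cm tolerance are all illustrative assumptions, not a worked-out implementation):

import ARKit
import simd

final class SignCaptureDelegate: NSObject, ARSessionDelegate {
    // Hypothetical reference pose from a native signer:
    // joint name (rawValue) -> position in body-anchor space.
    var referencePose: [String: SIMD3<Float>] = [:]

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            // Extract 3D joint positions only, discarding appearance entirely.
            let joints: [ARSkeleton.JointName] = [.head, .leftHand, .rightHand]
            for joint in joints {
                guard let transform = body.skeleton.modelTransform(for: joint),
                      let target = referencePose[joint.rawValue] else { continue }
                let position = SIMD3<Float>(transform.columns.3.x,
                                            transform.columns.3.y,
                                            transform.columns.3.z)
                // Geometric distance to the reference drives the feedback.
                let error = simd_distance(position, target)
                if error > 0.05 { // 5 cm tolerance, purely illustrative
                    print("Adjust \(joint.rawValue): off by \(error) m")
                }
            }
        }
    }
}

// Usage: run body tracking with the delegate attached.
// let session = ARSession()
// session.delegate = SignCaptureDelegate()
// session.run(ARBodyTrackingConfiguration())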
Replies: 1 · Boosts: 0 · Views: 753 · Activity: Dec ’25
Camera Crashes
Hi everybody, I'm trying to build a QR code scanner and generator app for iOS. Whenever I try to use the camera, the app crashes with this message: "This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data." I tried reducing the app to a bare minimum with nothing but the camera, with the same result. Any ideas? Thank you and best regards, Horst Schippers
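The crash message itself names the fix: add an NSCameraUsageDescription entry to the target's Info.plist. With that key in place, a sketch of the capture setup (the function and string below are illustrative) requests permission instead of crashing:

import AVFoundation

// Requires Info.plist to contain:
//   NSCameraUsageDescription = "Used to scan QR codes."
// Without that key, creating camera input terminates the app.
func startScanningIfAuthorized(session: AVCaptureSession) {
    AVCaptureDevice.requestAccess(for: .video) { granted in
        guard granted,
              let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input)
        else { return }
        session.addInput(input)
        session.startRunning() // begin delivering frames for QR detection
    }
}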
Replies: 0 · Boosts: 0 · Views: 75 · Activity: Apr ’25
TabItems in SwiftUI do not scale
I have a TabView with a sample tabItem as follows:

.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .accessibilityLabel("Import Text")
}

But turning on the accessibility setting for larger display sizes does not seem to work, nor do Dynamic Type font sizes:

.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .font(.largeTitle)
        .accessibilityLabel("Import Text")
}

The tabItems appear at a fixed size. The tab contents scale well, so this does not look pleasant at all. Is this a known bug in SwiftUI?
Replies: 0 · Boosts: 0 · Views: 745 · Activity: Jul ’25
Accessibility Traits for Children of a Tab Bar
Hi! I'm working on an application where I'd like VoiceOver to give each element of a tab bar the "Tab" trait. I'm testing this using the Accessibility Inspector. Essentially, I'd like to replicate the behavior of how Safari identifies each of its tabs as a "Tab" (I've attached a photo below). How exactly is this accomplished? I've tried using the .isTabBar trait to designate the child objects as "Tabs", but this doesn't seem to be working and I've struggled to find documentation about this. For additional context, these child items are Buttons, and I would like to have the .isButton trait essentially replaced by something like an .isTab trait. Not sure if this is actually possible or not, but curious how the Accessibility Inspector recognizes this in Safari.
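One pattern worth trying, offered as an assumption based on how the UIKit trait is documented rather than a confirmed answer: apply UIAccessibilityTraits.tabBar to the container that owns the buttons rather than to each child, and verify the result in the Accessibility Inspector. The view and button names below are illustrative.

import UIKit

final class CustomTabBarView: UIView {
    let importButton = UIButton(type: .system)
    let exportButton = UIButton(type: .system)

    override func didMoveToSuperview() {
        super.didMoveToSuperview()
        // Mark the container, not the children, as the tab bar, the way
        // Safari's tab strip reads, and list the children as its elements.
        accessibilityTraits = .tabBar
        accessibilityElements = [importButton, exportButton]
        importButton.accessibilityTraits = .selected // currently active tab
    }
}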
Replies: 0 · Boosts: 0 · Views: 180 · Activity: Jun ’25