Apple Releases White Paper on Protecting Children Online, Unveiling New Safety Features
Apple has released a white paper, "Helping Protect Children Online," on its developer website, announcing several new features designed to enhance children's online safety. These features aim to provide a safer, more age-appropriate app experience for children and address the growing global concern surrounding online child safety.
The white paper outlines key measures Apple is taking to bolster child online safety, primarily focusing on: an updated age rating system, a streamlined child account setup process, child account permission groups, and a new age API. The combined features aim to provide parents and developers with stronger tools to ensure children are adequately protected when using Apple devices and apps.
Stricter Age Ratings and Account Setup
Apple has updated its age rating system, expanding the previous four categories (4+, 9+, 12+, and 17+) to five (4+, 9+, 13+, 16+, and 18+). The more granular classification reflects app content maturity more accurately and helps parents select apps appropriate for their children's ages.
The new age rating system more clearly defines the types of content associated with each age rating. For example, 9+ apps may contain mild cartoon violence, profanity, or crude humor; 13+ apps may include mild medical or treatment-related content, alcohol or tobacco use, suggestive themes or nudity, realistic violence, or simulated gambling; while 16+ and 18+ apps contain more mature content such as unrestricted web access, frequent or intense sexual themes, and gambling.
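To make the new tiers concrete, the following Swift sketch models them as a simple enumeration. The type name and the content-descriptor strings are illustrative paraphrases of the summary above, not Apple's actual API or App Store Connect metadata.

```swift
// Illustrative only: a simple model of the five new rating tiers described
// above. The enum and descriptor strings are assumptions for this sketch,
// not Apple's actual rating schema.
enum AppAgeRating: String, CaseIterable {
    case fourPlus = "4+"
    case ninePlus = "9+"
    case thirteenPlus = "13+"
    case sixteenPlus = "16+"
    case eighteenPlus = "18+"

    /// Example content descriptors, paraphrased from the white paper's
    /// summary; the real criteria are defined by Apple.
    var exampleContent: [String] {
        switch self {
        case .fourPlus:
            return ["No objectionable material (assumed baseline)"]
        case .ninePlus:
            return ["Mild cartoon violence", "Profanity or crude humor"]
        case .thirteenPlus:
            return ["Mild medical or treatment content", "Alcohol or tobacco use",
                    "Suggestive themes", "Realistic violence", "Simulated gambling"]
        case .sixteenPlus:
            return ["More mature or intense themes"]
        case .eighteenPlus:
            return ["Unrestricted web access", "Frequent or intense sexual themes", "Gambling"]
        }
    }
}

// Usage: list each tier with its example descriptors.
for rating in AppAgeRating.allCases {
    print(rating.rawValue, "-", rating.exampleContent.joined(separator: ", "))
}
```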
Apple is also streamlining the account creation process for children. When a new Apple account is created, the system asks for the user's age range. If the account is for a child under 13, the system automatically surfaces the "Family Sharing" option and guides parents through the necessary parental control settings, so appropriate permissions and restrictions are in place before the child uses the device. Even if a parent is not present, a child can still create an account and use the device, but an "age wall" is enabled automatically to filter inappropriate content. Parents can adjust or disable the age wall at any time.
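The decision logic of that setup flow can be sketched roughly as follows. The types and names here (AgeRange, AccountSetup, contentFilterEnabled) are hypothetical and exist only to illustrate the flow described above; they are not part of any Apple SDK, and behavior for age ranges the white paper does not detail is left as a placeholder.

```swift
// A minimal sketch of the child-account setup flow, under the assumptions
// stated above. Not an Apple API.
enum AgeRange {
    case underThirteen
    case thirteenToSeventeen
    case adult
}

struct AccountSetup {
    var familySharingPromptShown = false
    var contentFilterEnabled = false   // the "age wall" described above

    mutating func configure(for ageRange: AgeRange, parentPresent: Bool) {
        switch ageRange {
        case .underThirteen:
            // Child accounts surface Family Sharing so a parent can set
            // permissions and restrictions before the device is used.
            familySharingPromptShown = true
            // Filtering defaults to on even if no parent is present;
            // a parent can later adjust or disable it.
            contentFilterEnabled = true
        case .thirteenToSeventeen, .adult:
            // Behavior for older age ranges is not detailed in the summary
            // above; left with defaults here.
            break
        }
    }
}

// Usage: a child setting up a device without a parent present.
var setup = AccountSetup()
setup.configure(for: .underThirteen, parentPresent: false)
print(setup.familySharingPromptShown, setup.contentFilterEnabled)  // true true
```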
Developer Tools: Age API and App Information Disclosure
Apple will also provide developers with a new age range API to help them confirm a user's age range, ensuring children are not exposed to content inappropriate for their age. This API is designed to protect user privacy, preventing apps from accessing children's specific personal information. Parents can choose whether to share age range information with developers, and Apple is committed to minimizing data collection while providing the service.
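As a rough illustration of how an app might consume such a signal, the sketch below assumes a hypothetical age-range service that returns only a coarse range, which the app uses to choose an appropriate content configuration. All names here (AgeRangeService, AgeRangeResponse, and the helper functions) are invented for this example and do not reflect the actual API surface Apple will ship.

```swift
// Hypothetical sketch: consuming a coarse age-range signal without ever
// receiving a birth date or other identifying detail.
import Foundation

enum AgeRangeResponse {
    case declined                                   // parent chose not to share
    case range(lowerBound: Int, upperBound: Int?)   // e.g. 13...15, or 18 and up
}

protocol AgeRangeService {
    /// Asks the system for the user's age range; only a coarse range is
    /// returned, never precise personal information.
    func requestAgeRange() async -> AgeRangeResponse
}

func configureExperience(using service: AgeRangeService) async {
    switch await service.requestAgeRange() {
    case .declined:
        // Without a shared range, fall back to the most restrictive,
        // age-appropriate default.
        applyRestrictedDefaults()
    case .range(let lower, _):
        if lower < 13 {
            applyChildSafeContent()
        } else {
            applyStandardContent()
        }
    }
}

func applyRestrictedDefaults() { /* show only general-audience content */ }
func applyChildSafeContent()   { /* filter mature content and personalized ads */ }
func applyStandardContent()    { /* default experience */ }
```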
The white paper also requires third-party developers to disclose whether their app serves personalized content or advertising based on user profiles, whether it includes an age wall, and whether it offers parental controls. This information will be displayed on the App Store product page so parents can make informed choices before downloading an app.
Responding to Global Child Online Safety Regulatory Trends
Apple's white paper highlights its commitment to child online safety and emphasizes its efforts to align with growing global regulatory trends. The US is considering stricter child protection legislation requiring app store operators to verify age and obtain parental consent before allowing minors to download apps. The UK and Australia, among other locations, have already implemented various laws and regulations requiring social media companies to ensure children cannot access inappropriate content.
Apple's App Store improvements are a direct response to this regulatory context. The company says it is committed to creating a safer, more positive digital environment for children and is working with regulators worldwide to protect children online. By improving its age rating system, streamlining account setup, providing developer tools, and responding to global regulatory trends, Apple underscores that commitment while aiming to collect only the minimum data needed to deliver these services and to protect user privacy. Apple notes that although a small share of App Store apps may genuinely require age verification, requiring every user to submit sensitive personally identifiable information would harm user privacy and inconvenience users who have no need to provide it. The company therefore says it will continue exploring the best balance between child safety and user privacy.