A display anomaly affecting photos shared from Android phones to iPhones has surfaced after an early iOS build labelled iOS 26 began circulating among testers, with users reporting that images take on a red or magenta tint when viewed in Apple’s Photos app and in certain messaging threads. The issue has drawn attention because it alters colours without warning and affects images that look normal on the sending device and on other platforms.

Early reports point to colour-management conflicts rather than file corruption. Images captured on Android devices using modern HDR pipelines, particularly those saved with extended colour profiles and high bit depth, are said to display inaccurately once imported or received on iPhones running the affected build. Users say the tint is most pronounced in scenes with skin tones, warm lighting, or deep shadows, while screenshots and non-HDR images appear largely unaffected.
Testing by developers indicates the problem emerges when the iPhone attempts to interpret HDR metadata embedded in images created under Android’s Ultra HDR and Display P3-adjacent workflows. When the metadata is misread or partially ignored, the tone-mapping stage can skew red channels upward, leading to the visible cast. Similar behaviour has been observed in limited circumstances in earlier iOS versions, but the scope described by testers suggests a regression tied to changes in Apple’s photo rendering stack.
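The failure mode testers describe can be illustrated with a toy numerical sketch. The formula and values below are simplified assumptions, not Apple’s or Google’s actual pipeline: an Ultra HDR file stores an SDR base image plus a gain map, and a viewer boosts each pixel by roughly 2^(g · log2(headroom)), where g is the gain-map sample and the headroom comes from metadata. The point is only that applying the boost to one channel but not the others produces exactly the kind of red cast being reported.

```python
import numpy as np

def apply_gain(sdr, gain_map, headroom):
    """Boost linear SDR pixels by 2^(g * log2(headroom)) -- simplified gain-map math."""
    boost = 2.0 ** (gain_map[..., None] * np.log2(headroom))
    return sdr * boost

# One mid-grey pixel (linear RGB) with a fairly strong gain-map sample.
sdr = np.array([[[0.5, 0.5, 0.5]]])
gain = np.array([[0.8]])

correct = apply_gain(sdr, gain, headroom=4.0)   # metadata parsed correctly

# A renderer that misreads the metadata and boosts only part of the
# pipeline -- modelled here as green and blue losing the boost -- leaves
# the red channel inflated relative to the others.
skewed = correct.copy()
skewed[..., 1:] = sdr[..., 1:]

r_over_g_correct = float(correct[0, 0, 0] / correct[0, 0, 1])  # 1.0: neutral grey
r_over_g_skewed = float(skewed[0, 0, 0] / skewed[0, 0, 1])     # > 1: visible red cast
```

In the correct path the boost is uniform, so the red/green ratio stays at 1.0; in the mis-parsed path the same pixel’s red channel is roughly three times its green, which on screen reads as the magenta-red wash users describe.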
The issue has been most visible in photos received via cross-platform messaging and file-sharing apps, including images sent through encrypted chats or transferred using cloud links. In many cases, the same file appears normal when opened in a third-party image viewer on the iPhone or when exported to a Mac, suggesting the underlying data remains intact. This has reinforced the view that the glitch sits within Apple’s native rendering pipeline rather than the image itself.
Apple has not issued a public statement acknowledging the bug. Engineers familiar with iOS development note that early builds often include experimental changes to colour science and HDR handling, especially as Apple aligns its software with new display hardware and camera features across devices. Such changes can introduce compatibility problems with files generated outside Apple’s ecosystem, particularly as Android manufacturers have expanded HDR capture and adopted Google’s Ultra HDR standard across a growing range of handsets.
Android vendors have increasingly leaned on aggressive HDR tone mapping to improve perceived image quality, saving photos with gain maps and metadata designed to adapt dynamically to different displays. While the approach improves consistency across Android devices, it relies on receiving platforms to interpret the metadata correctly. Any mismatch can produce visible artefacts, with colour shifts among the most common symptoms.
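The display-adaptive behaviour described above can be sketched in a simplified form of the gain-map maths from Google’s Ultra HDR documentation. The field names and numbers here are illustrative, not the exact spec fields: the file carries minimum and maximum content-boost metadata, and the viewer weights the gain map by how much headroom the target display offers.

```python
import math

def display_boost(g, min_boost, max_boost, display_headroom):
    """Boost for one gain-map sample, adapted to the display (simplified)."""
    # Weight is 0 on an SDR display and 1 once the display can show the
    # image's full declared headroom.
    w = min(max(math.log2(display_headroom) / math.log2(max_boost), 0.0), 1.0)
    log_gain = (1.0 - g) * math.log2(min_boost) + g * math.log2(max_boost)
    return 2.0 ** (w * log_gain)

# A receiver that parses the file's content-boost metadata correctly:
ok = display_boost(g=0.9, min_boost=1.0, max_boost=8.0, display_headroom=4.0)

# A receiver that ignores the metadata and assumes a default boost range
# applies the wrong curve to the very same gain-map sample:
wrong = display_boost(g=0.9, min_boost=1.0, max_boost=2.0, display_headroom=4.0)
```

The two results differ substantially for identical pixel data, which is the core interoperability risk: the rendered image depends entirely on the receiving platform honouring metadata it did not generate.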
Some testers have identified temporary workarounds. Disabling HDR viewing in Photos, converting images to standard dynamic range before sharing, or opening affected photos in third-party apps can restore more accurate colours. Resaving the image after a minor edit, such as cropping, has also been reported to strip problematic metadata, reducing the tint. These steps are viewed as stopgaps rather than solutions, particularly for users who rely on seamless cross-platform sharing.
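The “resave to strip metadata” workaround can be sketched as follows. Ultra HDR JPEGs carry their HDR payload in ancillary APP segments (APP1 for EXIF/XMP, APP2 for the MPF index pointing at the secondary gain-map image), so removing those segments leaves a plain SDR JPEG with no metadata for a buggy renderer to misread. This is a simplified parser for illustration only; it ignores pre-scan edge cases a production tool would handle.

```python
def strip_app_segments(jpeg: bytes) -> bytes:
    """Remove APP1/APP2 segments from a baseline JPEG byte stream."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i + 4 <= len(jpeg):
        marker = jpeg[i + 1]
        if jpeg[i] != 0xFF or marker == 0xDA:
            out += jpeg[i:]              # start of scan: copy the rest verbatim
            break
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")
        if marker not in (0xE1, 0xE2):   # keep everything except APP1/APP2
            out += jpeg[i:i + 2 + length]
        i += 2 + length
    return bytes(out)

# A minimal synthetic JPEG: SOI, one APP1 "metadata" segment, then a scan.
payload = b"Exif\x00\x00hdr-gain-map-metadata"
app1 = b"\xff\xe1" + (len(payload) + 2).to_bytes(2, "big") + payload
scan = b"\xff\xda\x00\x04\x00\x00" + b"entropy-coded-image-data"
sample = b"\xff\xd8" + app1 + scan

stripped = strip_app_segments(sample)    # SOI + scan only; metadata segment gone
```

Note that this also discards orientation and colour-profile hints stored in the same segments, which is why testers treat such tricks as stopgaps rather than fixes.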
The episode has reignited broader discussion around interoperability between Apple’s tightly controlled imaging pipeline and the more heterogeneous Android ecosystem. As both platforms push computational photography and extended colour, the margin for misinterpretation grows. Industry analysts note that colour-management bugs, while rarely headline-grabbing, can erode trust when users feel their photos are being altered without consent.