Apple has slowly been making its devices easier to fix, but the iPhone 15 fell short in a couple of key areas, according to the repairability site iFixit. Namely, the battery was hard to remove and the device suffered from a “parts pairing” issue that meant you couldn’t easily replace the LiDAR sensor with one from another phone. With those two problems, iFixit gave the iPhone 15 a relatively low 4/10 repairability score.
Apple has now shared details on iPhone 16 repairability, and it appears to have addressed both of those issues and a bunch more. The company says it tries to strike a balance between durability and repairability, and with its latest devices it focused particularly on the repairability side.
There’s now an entirely new way to remove the battery that’s supposed to make it easier. By running a low-voltage electrical current through the new ionic liquid battery adhesive (using a 9V cell, for instance), the battery will release itself from the enclosure. This makes removal faster and safer compared to the stretch-release adhesives used previously, according to the company.
At the same time, Apple made changes to the Face ID sensor hardware starting with the iPhone 16 and iPhone 16 Pro. Now, the TrueDepth Camera can be swapped from one unit to another without compromising security or privacy. Before, only Apple was able to do that type of repair.
Another big change is the new Repair Assistant, designed to address parts pairing issues. It lets customers and repair professionals configure both new and used Apple parts directly on the device, with no need to contact Apple personnel. Repair shops previously needed to order official components directly from Apple and get on the phone with an employee before iOS would accept individual part replacements.
Apple added newly repairable modules too, saying the TrueDepth Camera can now be configured on-device for iPhone 12 and later, eliminating the need for a tethered Mac. In addition, the LiDAR scanner on iPhone Pro models is now serviceable along with the rear camera module.
Another big change is on-device access to diagnostics. Starting with iOS 18, Apple diagnostics for repair will be available on device, so customers can determine which parts need to be replaced without the need for a second device.
Finally, the company announced new support for third-party and used Apple parts. If a third-party part can’t be calibrated on Apple’s cloud-based servers, the iPhone or other device will try to activate the part and operate it to its full capability, while showing the repair history within settings. Used Apple parts can soon be calibrated and will appear as a “used” part in the device’s repair history. Another future update will enable True Tone for third-party displays and battery health for third-party batteries. In addition, the LiDAR Scanner and front camera will still work when the module is replaced and left unconfigured.
All told, the iPhone 16 series looks to have one of the biggest jumps in repairability yet, with improvements in physical access, parts compatibility and parts pairing. We’ll soon see if that’s reflected in iFixit’s impending repairability score.
The thing to understand about Apple Watch releases is that they aren’t targeting last year’s buyers. Nor even the year prior’s. Instead, they’re targeting previous Apple Watch owners three to five generations back. And true to that form, each year Apple slightly increases the specs to make the upgrade path a bit more appealing. A bit more shiny. And a bit more brilliant.
Which is exactly what the Apple Watch Series 10 does. There are no earth-shattering new features here for most consumers (well, unless you snorkel a lot). But under the covers, it’s a substantial change in terms of the internals to fit into the thinnest Apple Watch to date, which also sports the biggest screen of any Apple Watch to date. And you’ll certainly notice that big screen when comparing it to even just last year’s 45mm unit. Beyond that, the majority of the new features come from the also-new watchOS 11, bringing in sports training load, overnight sleep metric trending, a new Tides app, and more.
I’ve been putting all these features to the test, both on the Apple Watch Series 10 as well as within watchOS 11, to see how well they work day-to-day as well as in sports applications. As usual, this watch is a media loaner, and it’ll go back to Apple. After which I’ll go out and get my own for any future testing needs. If you found this review useful, you can use the links at the bottom, or consider becoming a DCR Supporter, which makes the site ad-free, while also getting access to a behind-the-scenes video series. And of course, it makes you awesome.
With that, let’s get into it!
What’s New:
As is always the case with Apple Watch releases, it’s really divided into two major camps: things that are new on the watch itself (usually hardware) relative to the previous hardware version, and things that are new due to the new watchOS platform (which is announced months earlier in June). In this review, I’m aiming to cover both of those, given I’ve spent all summer on watchOS 11 since back in early June.
Starting on the Series 10-specific side first, here’s what’s new there:
– Totally new internal design, and new case sizes being introduced: 42mm/46mm
– Biggest usable display area to date in any Apple Watch: 374x446px for the 42mm, 416x496px for the 46mm
– Thinnest design to date (9.7mm versus 10.7mm on Series 9)
– New ‘Ionic Glass’ screen design, which tapers further down the edge of the display
– New ‘Wide Angle OLED’ display, which is 40% brighter when viewed at an angle
– Always-on display mode now shows seconds even when your wrist is down (updates at a 1-second rate in standby mode, versus a 1-minute rate)
– Added ability for the speaker to play media/music (previously it couldn’t; you had to connect headphones)
– Faster charging: 80% charge in 30 minutes (fastest Apple Watch to date)
– New S10 SiP (chipset) inside the Series 10
– Adds automatic background noise removal for voice calls, via a new neural network
– Adds depth gauge (supports depths to 6m/20ft for snorkeling, while still maintaining 50m waterproofing)
– Adds temperature sensor (supports water temperature for swimming activities)
– Adds new ‘Tides’ app to show tidal data globally
– Adds snorkeling support for Series 10 (via 3rd-party Oceanic app/partnership)
– Adds sleep apnea detection, with monthly analysis reports
– Adds new sleep metric: breathing disturbances (which feeds into sleep apnea detection)
– Adds new ‘Flux’ watch face
– Adds new ‘Reflections’ watch face
– Adds new metal backplate to the watch
– Three color options for the base edition: Rose Gold, Silver Aluminum, Jet Black
– Three new polished titanium versions, which weigh 20% less than the existing stainless steel variants
– Titanium Series 10 is a carbon-neutral product
– Same pricing at $399 (42mm) & $429 (46mm), or $499/$529 for the cellular editions
– Pricing for titanium is $699 for the 42mm and $749 for the 46mm (but includes cellular)
– Shipping on September 20th, 2024
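Since those resolution numbers are easy to gloss over, here’s a quick bit of arithmetic on what the two panel sizes work out to in raw pixels. Note the Series 9 45mm resolution (396x484px) is my own figure for comparison, not something listed above:

```python
# Pixel-count arithmetic for the display resolutions quoted in the spec list.
def pixel_count(width_px: int, height_px: int) -> int:
    """Total pixels of a width x height display."""
    return width_px * height_px

s10_42mm = pixel_count(374, 446)  # 42mm Series 10
s10_46mm = pixel_count(416, 496)  # 46mm Series 10
s9_45mm = pixel_count(396, 484)   # 45mm Series 9 (assumed figure, for comparison)

print(f"42mm Series 10: {s10_42mm:,} px")
print(f"46mm Series 10: {s10_46mm:,} px")
print(f"46mm vs last year's 45mm: {s10_46mm / s9_45mm - 1:.1%} more pixels")
```

In other words, the 46mm panel has roughly 8% more pixels than last year’s 45mm, and about 24% more than the new 42mm.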
Got all that? Good. Now we get to layer in the watchOS 11 features, which cover plenty of areas, though I’ll mostly be focusing on the sports/fitness/health ones. They are as follows:
– Added new workout Training Load features
– Added Vitals app (for trending overnight sleep metrics)
– Added custom routes for hiking/walking/running
– Added true offline maps to the watch
– Added ability to pause Activity Rings
– Added ability to adjust goals on a per-day basis (e.g., Saturdays)
– Added structured swim workouts
– Added distance and route maps to a pile of sport types
– Added safety check-in feature at the start/end of a workout
– Added intelligent Smart Stack
– Added automatic offline language translation when you arrive in a country
– Added new watch faces/styles
And again, there’s a smattering of other very minor tweaks as well, but that’s the bulk of them.
Finally, note that there’s no change in the SpO2/blood oxygen sensor status. That remains as it has been since earlier this year: Apple is unable to sell units with it enabled within the US. Existing users keep the feature on their watch, as do all non-US purchases, but units purchased in the US will not have that feature enabled. The hardware remains the same, so in the (seemingly unlikely) event something changes in the court case, it could be re-enabled. In fact, you’ll notice that in watchOS 11 on the Series 10, the user interface slot that was in the ‘Vitals’ app for SpO2 is now removed.
Apple Intelligence is an artificial intelligence platform developed by Apple Inc.[1] Relying on a combination of on-device and server processing, it was announced on June 10, 2024 at WWDC 2024 as an integration into Apple’s iOS 18, iPadOS 18, and macOS Sequoia operating systems, which were announced at the same event. Apple Intelligence is free for all users with supported devices, and GPT-4o integration is also free for all users without needing to sign in.[2] It is not currently in the developer betas for iOS 18, iPadOS 18, and macOS Sequoia; however, it is scheduled to be released as a developer beta sometime during the summer.
Functionality and features
Apple Intelligence features writing tools that are powered by AI, including Rewrite and Proofread. These tools help enhance a user’s writing to make it more friendly, concise, or professional, similar to Grammarly’s AI writing features. They can also be used to generate summaries, key points, tables, and lists from an article or piece of writing.[3][4] ChatGPT will be available as part of Writing Tools, which helps users generate content for anything they are writing about. With Compose, users can access ChatGPT image tools to generate images to complement what they are writing.[5]
Image Playground
Apple Intelligence can also be used to generate images on-device through the Image Playground app. Similar to OpenAI’s DALL-E, it generates images from phrases and descriptions, with customizable styles such as Animation and Sketch. In Notes, users can access Image Playground on iPad through the new Image Wand tool in the Apple Pencil palette without having to open the Image Playground app. Rough sketches can be transformed into images, and users can also select empty space to create an image using context from the surrounding area.
Image Playground is also available in apps like Keynote, Freeform, and Pages, as well as in third-party apps that use the Image Playground API.[6]
Using Apple Intelligence, users can create an original Genmoji by simply typing a description. Users can pick someone from their photos and create a Genmoji that resembles them. Similar to emoji, Genmoji can be added inline to text messages and can also be shared as a sticker or reaction in a Tapback on iMessages and other supported third-party apps.
Siri, Apple’s virtual assistant, has been updated with enhanced capabilities made possible by Apple Intelligence. The latest iteration features an updated user interface, improved natural language processing, and the option to interact via text by double-tapping the home bar, without needing to enable the feature in the Accessibility menu. In addition, Apple Intelligence adds the ability for Siri to use personal context from device activities to make conversations feel more natural and fluid. Siri can also help users navigate their device more easily, give users device support everywhere they go, and will gain broader app support via the Siri App Intents API.[7][8] Siri will be able to deliver intelligence that’s tailored to the user and their on-device information. For example, a user can say, “Play that podcast that Jamie recommended,” and Siri will locate and play the episode, without the user having to remember whether it was mentioned in a text or an email. They could also ask, “When is Mom’s flight landing?” and Siri will find the flight details and cross-reference them with real-time flight tracking to give an arrival time.[5]
Apple Intelligence adds a feature called Priority Messages to the Mail app. A new section at the top of the inbox shows the most urgent emails. Across a user’s inbox, instead of previewing the first few lines of each email, users can see a summary of each email without needing to open it. Smart Reply provides suggestions for a quick response, and will identify and outline questions in an email.[5]
Apple’s Photos app includes a feature to create custom memory movies and enhanced search capabilities. Users can describe the story they want to see, and Apple Intelligence selects matching photos and videos. It organizes these into a movie with a narrative arc based on identified themes. Additionally, users can search for specific photos or videos by description and/or keyword, and Apple Intelligence can pinpoint particular moments within video clips.[9]
As a result of the company’s partnership with OpenAI, Apple Intelligence also includes a system-wide integration with ChatGPT, allowing Siri to determine when to send certain complex user requests to ChatGPT. This system-wide integration is powered by GPT-4o.[5] Users are asked before any questions are sent to ChatGPT, along with any documents or photos, and Siri then presents the answer directly.[5][10] Using ChatGPT features is free for all users without needing to sign in; however, paid subscribers can sign in to gain access to paid features systemwide. Requests made through Apple Intelligence are not stored by OpenAI, and users’ IP addresses are obscured; however, if users choose to connect their ChatGPT account, their data preferences will apply under ChatGPT’s policies.[11]
Although Apple Intelligence largely utilises on-device processing, it can scale its computational capacity and draw on server-based models for more complex requests. These models run on servers powered by Apple silicon, which allows Apple to ensure that data is never retained or exposed. Private Cloud Compute cryptographically ensures that Apple Intelligence does not talk to a server unless its software has been logged for inspection, and independent experts can inspect the code that runs on these Apple silicon servers to verify user privacy.[12][13][2][14]
In October 2015, Apple Inc. acquired Perceptio, an on-device artificial intelligence modeling company.[15] Following the acquisition, Apple engaged in efforts to ensure its artificial intelligence operations remained covert; according to University of California, Berkeley professor Trevor Darrell, the company’s secrecy deterred graduate students.[16]
The Private Cloud Compute platform powering Apple Intelligence is designed heavily with user privacy and end-to-end encryption in mind. Independent experts can inspect the code that runs on Private Cloud Compute servers to verify user privacy, and Private Cloud Compute cryptographically ensures that Apple Intelligence does not talk to a server unless its software has been logged for inspection.[13]
Apple Vision Pro available in the U.S. on February 2
The era of spatial computing is here — pre-orders begin Friday, January 19
Apple today announced Apple Vision Pro will be available beginning Friday, February 2, at all U.S. Apple Store locations and the U.S. Apple Store online. Vision Pro is a revolutionary spatial computer that transforms how people work, collaborate, connect, relive memories, and enjoy entertainment. Vision Pro seamlessly blends digital content with the physical world and unlocks powerful spatial experiences in visionOS, controlled by the most natural and intuitive inputs possible — a user’s eyes, hands, and voice. An all-new App Store provides users with access to more than 1 million compatible apps across iOS and iPadOS, as well as new experiences that take advantage of the unique capabilities of Vision Pro. Pre-orders for Apple Vision Pro begin Friday, January 19, at 5 a.m. PST.
“The era of spatial computing has arrived,” said Tim Cook, Apple’s CEO. “Apple Vision Pro is the most advanced consumer electronics device ever created. Its revolutionary and magical user interface will redefine how we connect, create, and explore.”
A Revolutionary Operating System and User Interface
Apple Vision Pro is powered by visionOS, which is built on the foundation of decades of engineering innovation in macOS, iOS, and iPadOS. visionOS delivers powerful spatial experiences, unlocking new opportunities at work and at home. Featuring a brand-new three-dimensional user interface and input system controlled entirely by a user’s eyes, hands, and voice, navigation feels magical. Intuitive gestures allow users to interact with apps by simply looking at them, tapping their fingers to select, flicking their wrist to scroll, or using a virtual keyboard or dictation to type. With Siri, users can quickly open or close apps, play media, and more.
Users can also immerse themselves in Environments — dynamic, beautiful landscapes like Haleakalā, Joshua Tree, and Yosemite national parks, and even the surface of the moon — to help them focus or reduce clutter in busy spaces. With Environments, a user’s world can grow beyond the dimensions of a physical room. With a twist of the Digital Crown, users can control how present or immersed they are in an environment.
Extraordinary Experiences
Apple Vision Pro brings a new dimension to powerful, personal computing by changing the way users interact with their apps. The three-dimensional interface frees apps from the boundaries of a display so they can appear side by side at any scale, providing the ultimate workspace and creating an infinite canvas for multitasking and collaborating.
Since visionOS leverages existing developer frameworks, more than 1 million familiar apps across iOS and iPadOS are available on Apple Vision Pro and automatically work with the new input system. Vision Pro also has an all-new App Store where users can find apps that deliver spatial computing experiences unlike any other platform. Apps can be arranged anywhere and scaled to the perfect size, all while allowing the user to stay present in their space.
An infinite canvas for productivity: With key productivity and collaboration apps like Fantastical, Freeform, JigSpace, apps from Microsoft 365, and Slack, Apple Vision Pro is an ideal productivity tool for everyday tasks. Apps can appear side by side at any scale for incredible multitasking, and with support for Magic Keyboard and Magic Trackpad, users can create the perfect workspace. With Mac Virtual Display, users can even bring the powerful capabilities of their Mac into Vision Pro, creating an enormous, private, and portable 4K display, ideal for pro workflows.
The ultimate entertainment experience: Apple Vision Pro features ultra-high-resolution displays that deliver more pixels than a 4K TV for each eye, enabling users to watch movies and TV shows from Apple TV+, Disney+,1 Max, and other services on a screen that feels 100 feet wide with support for HDR content. Within the Apple TV app, users can access more than 150 3D titles with incredible depth wherever they are. Vision Pro also introduces Apple Immersive Video, a remarkable new entertainment format pioneered by Apple that puts users inside the action with 180-degree, three-dimensional 8K recordings captured with Spatial Audio. Users can also enjoy new interactive experiences like Encounter Dinosaurs.
New gaming experiences: Players can access games on the App Store, including more than 250 titles on Apple Arcade. Hit games like NBA 2K24 Arcade Edition and Sonic Dream Team can be played on a screen as large as they want with incredible audio and support for popular game controllers. New spatial games, including Game Room, What the Golf?, and Super Fruit Ninja, take advantage of the powerful capabilities of Apple Vision Pro to transform the space around players, offering unique and engaging gameplay experiences.
Memories Brought to Life
Apple Vision Pro enables users to capture and relive their favorite memories in entirely new ways. Spatial photos and videos transport users back to a special moment in time, and Spatial Audio makes the experience incredibly immersive. When users are on the go, they can capture spatial video on their iPhone 15 Pro or iPhone 15 Pro Max and relive those moments on Vision Pro. Users can also view all their photos and videos at a life-size scale with brilliant color and spectacular detail, including Panoramas that expand and wrap around the user, making them feel like they are right where the photo was taken.
FaceTime Becomes Spatial
FaceTime on Apple Vision Pro takes advantage of the space around the user so that everyone on a call appears life-size, while Spatial Audio makes it sound like each person’s voice comes from the location of their tile. If a user is wearing Vision Pro while on FaceTime, they appear as their Persona, while others joining from a Mac, iPad, or iPhone will appear in a tile.
Persona is an authentic spatial representation of an Apple Vision Pro user that enables others on a call to see their facial expressions and hand movements — all in real time.2 Using machine learning techniques, a Persona can be created in just minutes using Vision Pro. Personas also work in third-party videoconferencing apps including Zoom, Cisco Webex, and Microsoft Teams.
Breakthrough Design
Apple Vision Pro builds on Apple innovation and experience designing high-performance products like Mac, iPhone, and wearables like Apple Watch, culminating in the most advanced personal electronics device ever. An astonishing amount of technology is packed into a beautiful, compact design that utilizes the most advanced materials possible to achieve ambitious goals for performance, mobility, and wearability.
Apple Vision Pro is designed as a modular system so users can personalize their fit. A singular piece of three-dimensionally formed, laminated glass gently curves around the user’s face and flows into the custom aluminum alloy frame. The Light Seal is made of a soft textile and comes in a range of shapes and sizes, flexing to conform to a user’s face for a precise fit. Flexible straps ensure audio remains close to the user’s ears, while the included Solo Knit Band and Dual Loop Band allow users to find the optimal fit for them. For those with vision correction needs, ZEISS Optical Inserts are available with a prescription or as readers that magnetically attach to Vision Pro, allowing users to take full advantage of the display’s incredible sharpness and clarity.3
Unrivaled Innovation
Apple Vision Pro is designed to deliver phenomenal compute performance in a compact wearable form factor. Featuring a breakthrough ultra-high-resolution display system built on top of Apple silicon, Vision Pro uses micro-OLED technology to pack 23 million pixels into two displays, each the size of a postage stamp, with wide color and high dynamic range. This technological breakthrough, combined with custom lenses that enable incredible sharpness and clarity, and advanced Spatial Audio, delivers jaw-dropping experiences.
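The “more pixels than a 4K TV for each eye” claim is easy to sanity-check with a little arithmetic; the 23 million figure is Apple’s, while the 3840x2160 definition of a 4K UHD panel is the standard one:

```python
TOTAL_PIXELS = 23_000_000      # both micro-OLED displays combined, per Apple
per_eye = TOTAL_PIXELS // 2    # 11.5 million pixels per display

uhd_4k = 3840 * 2160           # a standard 4K UHD TV panel: 8,294,400 px

print(f"Per-eye pixels: {per_eye:,}")
print(f"4K UHD panel:   {uhd_4k:,}")
print(f"Each eye sees roughly {per_eye / uhd_4k:.2f}x the pixels of a 4K TV")
```

That works out to roughly 1.4x a 4K TV’s pixels per eye, consistent with the claim.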
Apple Vision Pro also features a high-performance eye tracking system that uses high-speed cameras and a ring of LEDs that project invisible light patterns onto the user’s eyes for responsive, intuitive input. And to help the user stay connected to the people around them, Apple designed a groundbreaking new feature called EyeSight. When a person approaches someone wearing Vision Pro, the device looks transparent — letting the user see them while also displaying the user’s eyes. When a user is immersed in an Environment or using an app, EyeSight gives visual cues to others about what the user is focused on.
The breakthrough display, advanced audio experiences, high-performance eye tracking system, and more are powered by Apple silicon in a unique dual-chip design. The M2 chip delivers powerful standalone performance, while the brand-new R1 chip processes input from 12 cameras, five sensors, and six microphones to ensure that content feels like it is appearing right in front of the user’s eyes.
Privacy and Security at Its Core
Apple Vision Pro offers industry-leading privacy and security. Optic ID is a new authentication system that analyzes a user’s iris to unlock Vision Pro, autofill passwords, and complete payments with Apple Pay. Where a user looks stays private while navigating Vision Pro, and eye tracking information is not shared with Apple, third-party apps, or websites. EyeSight also includes a visual indicator that makes it clear to others when a user is capturing a spatial photo or video.
Accessibility in visionOS
As with all Apple products, powerful accessibility features have been built right into visionOS. Key accessibility features like VoiceOver, Zoom, Switch Control, Guided Access, and more have been reimagined for spatial computing. Users can interact with Apple Vision Pro entirely with their eyes, hands, or voice, or any combination that works best for them. They can select a preferred input method such as their eyes, finger, or wrist using Pointer Control, pause on an element of visionOS for a few seconds to simulate a tap using Dwell Control, or simply use voice commands for activities across Vision Pro using Voice Control. If input from both eyes is not an option, visionOS also allows eye tracking with one dominant eye.
Apple Vision Pro and the Environment
Apple Vision Pro is designed with the environment in mind, with 100 percent recycled rare earth elements in all magnets and 100 percent recycled tin soldering and gold plating in multiple printed circuit boards. The frame and battery enclosure contain 100 percent recycled aluminum, and the Light Seal and Solo Knit Band are each made with over 70 percent recycled yarn. Vision Pro meets Apple’s high standards for energy efficiency and is free of mercury, brominated flame retardants, PVC, and beryllium. The packaging is 100 percent fiber-based, bringing Apple closer to its goal of eliminating plastics in all packaging by 2025.
Today, Apple is carbon neutral for its global corporate operations, and by 2030, plans to be carbon neutral across the entire manufacturing supply chain and life cycle of every product.
Pricing and Availability
Apple Vision Pro will be available starting at $3,499 (U.S.) with 256GB of storage. Pre-orders for Apple Vision Pro will begin on Friday, January 19, at 5 a.m. PST, with availability beginning Friday, February 2.
Apple Vision Pro will be available at all U.S. Apple Store locations and the U.S. Apple Store online.
ZEISS Optical Inserts — Readers will be available for $99 (U.S.), and ZEISS Optical Inserts — Prescription will be available for $149 (U.S.).
Apple Vision Pro comes with a Solo Knit Band and Dual Loop Band — giving users two options for the fit that works best for them. Apple Vision Pro also includes a Light Seal, two Light Seal Cushions, an Apple Vision Pro Cover for the front of the device, Polishing Cloth, Battery, USB-C Charge Cable, and USB-C Power Adapter.
Samsung Electronics today unveiled the Galaxy S24 Ultra, Galaxy S24+ and Galaxy S24, unleashing new mobile experiences with Galaxy AI.1 Galaxy S series leads the way into a new era that will forever change how mobile devices empower users. AI amplifies nearly every experience on Galaxy S24 series, from enabling barrier-free communication with intelligent text and call translations, to maximizing creative freedom with Galaxy’s ProVisual Engine, to setting a new standard for search that will change how Galaxy users discover the world around them.
“The Galaxy S24 series transforms our connection with the world and ignites the next decade of mobile innovation,” said TM Roh, President and Head of Mobile eXperience (MX) Business at Samsung Electronics. “Galaxy AI is built on our innovation heritage and deep understanding of how people use their phones. We’re excited to see how our users around the world empower their everyday lives with Galaxy AI to open up new possibilities.”
Make Everyday Experiences Epic
Galaxy AI introduces meaningful intelligence aimed at enhancing every part of life, especially the phone’s most fundamental role: communication. When you need to defy language barriers, Galaxy S24 makes it easier than ever. Chat with another student or colleague from abroad. Book a reservation while on vacation in another country. It’s all possible with Live Translate,2 which offers two-way, real-time voice and text translation of phone calls within the native app. No third-party apps are required, and on-device AI keeps conversations completely private.
With Interpreter, live conversations can be instantly translated on a split-screen view so people standing opposite each other can read a text transcription of what the other person has said. It even works without cellular data or Wi-Fi.
For messages and other apps, Chat Assist can help perfect conversational tones to ensure communication sounds as it was intended: like a polite message to a coworker or a short and catchy phrase for a social media caption.3 AI built into Samsung Keyboard can also translate messages in real time in 13 languages.4 In the car, Android Auto5 will automatically summarize incoming messages and suggest relevant replies and actions, like sending someone your ETA, so you can stay connected while staying focused on the road.
Organization also gets a big boost with Note Assist6 in Samsung Notes, featuring AI-generated summaries, template creation that streamlines notes with pre-made formats and cover creation to make notes easy to spot with a brief preview. For voice recordings, when there are multiple speakers, Transcript Assist7 uses AI and Speech-to-Text technology to transcribe, summarize and even translate recordings.
Communication isn’t the only way Galaxy S24 series takes the fundamental benefits of the phone into the future. Online search has transformed nearly every aspect of life. Galaxy S24 marks a milestone in the history of search as the first phone to debut intuitive, gesture-driven Circle to Search8 with Google. To give Galaxy users an incredible new tool, Galaxy turned to the worldwide leader of search, Google, and opened up new forms of discovery with a simple gesture. With a long press on the home button, users can circle, highlight, scribble on or tap anything on Galaxy S24’s screen to see helpful, high-quality search results. Seeing a beautiful landmark in the background of a friend’s social media post or a surprising fun fact on YouTube Shorts can quickly become an accurate search to learn more – without having to leave that app. And depending on a user’s location, for certain searches, generative AI-powered overviews can provide helpful information and context pulled together from across the web, and users can ask more complex and nuanced questions. It’s that easy. And that epic.
Unleash Creativity to Discover the World in New Ways
Galaxy S24 series’ ProVisual Engine9 is a comprehensive suite of AI-powered tools that transforms image capturing abilities and maximizes creative freedom every step of the way, from setting up a shot all the way to sharing it on social. Gone are the shaky, pixelated images taken from far away. Galaxy S24 Ultra’s Quad Tele System, with its new 5x optical zoom lens, works with the 50MP sensor to enable optical-quality performance at 2x, 3x, 5x, and 10x10 zoom levels, thanks to the Adaptive Pixel Sensor. Images also show crystal-clear results at up to 100x with enhanced digital zoom.
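Samsung doesn’t spell out the zoom pipeline here, but one common way a 5x lens plus a high-resolution sensor delivers “optical-quality” longer zooms is in-sensor cropping: doubling the zoom crops the frame by half in each dimension, leaving a quarter of the pixels. A rough sketch of that arithmetic (the crop-based approach is an assumption on my part, not a confirmed description of the Quad Tele System):

```python
def crop_zoom_mp(sensor_mp: float, lens_zoom: float, target_zoom: float) -> float:
    """Megapixels remaining after reaching target_zoom by center-cropping
    a sensor that sits behind a lens_zoom optical telephoto lens."""
    crop = target_zoom / lens_zoom        # linear crop factor
    if crop < 1:
        raise ValueError("target zoom is below the lens's native optical zoom")
    return sensor_mp / crop ** 2          # pixel count falls with the square of the crop

# 10x from the 5x lens + 50MP sensor: a 2x linear crop leaves 12.5MP.
print(crop_zoom_mp(50, 5, 10))
```

A 12.5MP crop is still larger than a typical 12MP output frame, which is why intermediate zoom steps like this can avoid digital upscaling entirely.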
With upgraded Nightography capabilities, photos and videos shot with Galaxy S24’s Space Zoom are brilliant in any condition, even when zoomed in. Capture more light in dim conditions with Galaxy S24 Ultra’s larger pixel size, now 1.4 μm, which is 60% bigger11 than in the previous model. Blur is reduced on Galaxy S24 Ultra with wider optical image stabilizer (OIS) angles and enhanced hand-shake compensation. When recording video, both the front and rear cameras are equipped with a Dedicated ISP Block for noise reduction, and Galaxy S24 analyzes gyro information to distinguish the filmer’s movement from the subject’s. This allows for more effective noise removal and clear videos in the dark, even from far away.
After great shots are captured, innovative Galaxy AI editing tools enable simple edits like erase, re-compose and remaster. For easier and more efficient optimizations, Edit Suggestion12 uses Galaxy AI to suggest suitable tweaks for each photo. To give users even more creative control and freedom, Generative Edit13 can fill in parts of an image background with generative AI. When a crooked picture is straightened, AI fills in the resulting borders. When an object needs to be slightly moved into the perfect position, AI lets users adjust the subject’s placement and generates a seamlessly blended background in its original spot. Anytime Galaxy S24 deploys generative AI to amplify an image, a watermark will appear on the image and in its metadata. And if an action-packed video needs to be slowed down, new Instant Slow-mo can generate additional frames based on movement to smoothly slow the action for a more detailed look.
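The core idea behind Instant Slow-mo is synthesizing new in-between frames so motion plays back smoothly at a slower speed. Samsung's feature uses AI-based motion analysis; the toy sketch below illustrates only the basic concept of in-between frame generation, using simple linear blending (which, unlike a motion-compensated interpolator, would ghost on fast motion).

```python
# Toy sketch of frame interpolation. Instant Slow-mo itself uses AI motion
# estimation; plain linear blending here just shows the "generate extra
# frames between the captured ones" idea.

def interpolate_frames(frame_a, frame_b, num_new):
    """Generate `num_new` evenly spaced in-between frames.

    Frames are flat lists of pixel intensities (0-255).
    """
    frames = []
    for i in range(1, num_new + 1):
        t = i / (num_new + 1)  # blend weight between the two source frames
        frames.append([round((1 - t) * a + t * b)
                       for a, b in zip(frame_a, frame_b)])
    return frames

# Slowing playback 2x means doubling the frame count: one synthetic
# frame inserted between every captured pair.
mid = interpolate_frames([0, 100, 200], [100, 100, 0], 1)
# mid[0] is the halfway blend: [50, 100, 100]
```

Inserting three frames per pair instead of one would yield a 4x slow-down by the same logic.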
To ensure every image stuns at every stage, Super HDR reveals lifelike previews before the shutter is ever pressed. And while capturing memories is an essential phone feature, sharing memories with the world is just as meaningful. Now, third-party social apps make the most of Galaxy’s AI-powered camera system: premium Galaxy S series camera features integrate directly with mobile apps in HDR to level up social sharing.14 When it’s time to view an image in Gallery or in an Instagram feed or Reels, photos and videos are also shown in Super HDR, which analyzes the highlighted sections of images for a more lifelike range of brightness, color and contrast.
Galaxy’s Most Intelligent Experience Ever, Powered by Premium Performance
As AI becomes a more prominent part of everyday life, performance must keep pace with its demands. Gaming. Heavy-duty video recording and editing. Jumping between five apps to plan a trip. Whatever the task, Galaxy S24 provides an incredible experience thanks to enhancements in its chipset,15 display and more. Every Galaxy S24 Ultra is equipped with the Snapdragon® 8 Gen 3 Mobile Platform for Galaxy.16 Optimized especially for Galaxy users, this chipset delivers a remarkable NPU improvement for incredibly efficient AI processing. In all three Galaxy S24 models, 1-120 Hz adaptive refresh rates also improve efficiency.
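The efficiency win from a 1-120 Hz adaptive refresh rate comes from the panel redrawing only as fast as the content changes. The sketch below is purely illustrative (not Samsung's implementation, and the supported rate steps are an assumption): it picks the lowest rate that still covers the content's update speed.

```python
# Illustrative sketch of adaptive refresh: drop the panel rate for static or
# slow content, ramp up only when motion demands it. The discrete rate steps
# (10/24/30/60/120 Hz) are assumed for illustration.

def choose_refresh_hz(content_fps, is_static):
    """Pick the lowest panel rate that still covers the content."""
    if is_static:
        return 1  # e.g. an idle screen barely changes, so 1 Hz suffices
    for rate in (10, 24, 30, 60, 120):
        if content_fps <= rate:
            return rate
    return 120  # cap at the panel's maximum

print(choose_refresh_hz(24, False))  # film playback -> 24
print(choose_refresh_hz(90, False))  # fast scrolling or gaming -> 120
```

Fewer redraws per second means less work for the display driver and panel, which is where the battery savings come from.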
Galaxy gaming is more powerful thanks to hardware and software improvements. Galaxy S24 Ultra boasts an optimal thermal control system with a 1.9-times-larger vapor chamber,17 improving device surface temperature while maximizing sustained performance. Ray tracing enables lifelike visuals with superior shadow and reflection effects. And through collaboration with industry-leading gaming partners, Galaxy S24 lets users enjoy optimized versions of popular global mobile games.
Visuals are more vibrant and captivating on the brightest Galaxy display ever.18 Galaxy S24 reaches a 2,600-nit peak brightness and delivers improved outdoor visibility with Vision Booster.
On the display, Corning® Gorilla® Armor19 on the Galaxy S24 Ultra is optically enhanced and demonstrates superior durability against everyday scratches. It reduces reflections by up to 75% in a wide range of lighting conditions, ensuring a smooth, comfortable viewing experience.
Across the Galaxy S24 series, design enhancements with slimmer and even bezels make it easier to immerse in any viewing experience and enable larger screen sizes on Galaxy S24+’s 6.7-inch and Galaxy S24’s 6.2-inch displays within nearly the same size specifications.20 Galaxy S24 Ultra has a 6.8-inch flatter display, optimized not just for viewing but also for productivity. Plus, Galaxy S24+ now supports the same level of QHD+ found on Galaxy S24 Ultra.
Advanced Security and Privacy Empowers User Choice and Trust
Secured by Samsung Knox, Galaxy’s defense-grade, multi-layer security platform, Galaxy S24 safeguards critical information and protects against vulnerabilities with end-to-end secure hardware, real-time threat detection and collaborative protection.
Samsung’s long-standing commitment to giving users choice and control over their devices continues in the era of AI. Galaxy S24 users have full control over how their data is used to enhance AI experiences through Advanced Intelligence settings, which can disable online processing of data for AI features.21
The Knox Matrix22 vision of a secure, connected and password-less future is also advanced with passkeys. Passkeys enable convenient and secure access to a user’s registered websites and apps across all their trusted devices through digital credentials, helping protect against phishing attacks. Enhanced Data Protection offers end-to-end encryption when users back up, sync or restore their data with Samsung Cloud, allowing Galaxy S24 users to connect to other devices while staying synchronized and secure. This ensures the data can only be encrypted or decrypted on a user’s devices, meaning nobody can see it but the user, even if a server is compromised or account details are stolen. And if access to a trusted device is lost, a recovery code can help prevent loss of data. Galaxy S24 is also protected by Samsung’s expansive list of innovative security and privacy features, including Knox Vault, Security & Privacy Dashboard, Auto Blocker, Secure Wi-Fi, Private Share, Maintenance Mode and more.
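The principle behind end-to-end encryption here is that data is encrypted on the user's device before it is ever uploaded, so the cloud only stores ciphertext it cannot read. The sketch below is a toy illustration of that principle only; it uses a throwaway XOR one-time pad rather than a real cipher (production systems like Samsung's use vetted algorithms, not this construction).

```python
# Toy sketch of the end-to-end encryption principle: encrypt on-device,
# upload only ciphertext, keep the key on trusted devices. XOR one-time pad
# used purely for illustration; real systems use vetted ciphers.

import secrets

def encrypt_on_device(plaintext: bytes):
    key = secrets.token_bytes(len(plaintext))   # never leaves trusted devices
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return key, ciphertext                      # only ciphertext is uploaded

def decrypt_on_device(key: bytes, ciphertext: bytes) -> bytes:
    return bytes(c ^ k for c, k in zip(ciphertext, key))

key, blob = encrypt_on_device(b"backup contents")
# A compromised server sees only `blob`; without `key` it learns nothing.
restored = decrypt_on_device(key, blob)
```

Because decryption requires the key held only on the user's devices, a server breach or stolen account credentials expose nothing readable, which matches the guarantee described above.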
The Next Phase of Samsung’s Environmental Journey
Establishing a new category of mobile experiences also means reimagining how Galaxy technology is designed and packaged to do more with less for people and the planet. Galaxy S24 continues to scale the variety of recycled materials in Galaxy devices by applying recycled plastics, glass and aluminum to internal and external components.23 It also takes these efforts one step further. For the first time, Galaxy S24 features components made with recycled cobalt and rare earth elements.24 In Galaxy S24 Ultra, a minimum of 50% recycled cobalt was used in the battery,25 and 100% recycled rare earth elements were incorporated into the speakers.26
Galaxy S24 is also the first Galaxy S series to be designed with recycled steel and thermoplastic polyurethane (TPU). Galaxy S24 Ultra features a minimum of 40% recycled steel in the speakers,27 and it includes a minimum of 10% pre-consumer recycled TPU in the side and volume keys. Additionally, every Galaxy S24 comes in packaging made from 100% recycled paper material.
The latest flagship continues Samsung’s commitment to extending the product lifecycle, offering seven generations of OS upgrades and seven years of security updates to help users reliably experience the optimized performance of their Galaxy devices for even longer.28 Lastly, Galaxy S24 is UL ECOLOGO® certified,29 and its carbon footprint has been measured and verified by The Carbon Trust.30
Galaxy S24 is a demonstration of progress against Samsung MX’s environmental roadmap. Samsung remains steadfast in delivering on its set of goals to be achieved by the end of 2025.31 At the end of 2022, Samsung achieved the first of these goals by incorporating recycled materials in all mobile products, from Galaxy smartphones and tablets to PCs and wearables. Today, the company is setting a new recycled material goal. By 2030, Samsung will incorporate at least one recycled material in every module32 of every mobile product.
Precision Technology and Elegance in Every Detail
Galaxy S24 Ultra is the first-ever Galaxy phone to feature a titanium frame,33 enhancing device durability and longevity. Galaxy S24 Ultra’s significantly thinner body enables a better on-the-go experience with a more comfortable grip. On Galaxy S24+ and Galaxy S24, a streamlined One-mass design sets a higher aesthetic standard with a seamless connection between the device’s rear cover and side frame. The Galaxy S24 series comes in Earth mineral-inspired color tones. On Galaxy S24 Ultra, colors34 include: Titanium Gray, Titanium Black, Titanium Violet and Titanium Yellow. On Galaxy S24+ and Galaxy S24, colors include: Onyx Black, Marble Gray, Cobalt Violet and Amber Yellow. All three models will come with additional colors available online only.
Apple today announced iOS 17, a major release that upgrades the communications experience across Phone, FaceTime, and Messages; makes sharing even easier with AirDrop; and provides more intelligent input that improves the speed and accuracy of typing. iOS 17 also introduces new experiences with Journal, an app that makes it easy for people to practice gratitude, and StandBy, a new way to view glanceable information when iPhone is set down and charging.
“With iOS 17, we’ve made iPhone more personal and intuitive by deeply considering the features we all rely on every day,” said Craig Federighi, Apple’s senior vice president of Software Engineering. “Phone, FaceTime, and Messages are central to how we communicate, and this release is packed with updates we think our users are going to love. We’ve also reimagined AirDrop with new ways to share, autocorrect gets even better, and we’re introducing all-new experiences with Journal and StandBy, plus so much more. We can’t wait for everyone to try it.”