Today — October 2, 2024 — MacRumors

Sonos Promises to Recommit to Software Quality and Customer Experience After App Disaster

By: Juli Clover
October 2, 2024 at 01:55
Sonos today announced a series of new commitments that are meant to demonstrate the company's "renewed focus" on software quality and customer experience. The announcement and an accompanying video from Sonos CEO Patrick Spence come as Sonos tries to ameliorate the negative experience customers have had with the May Sonos app update.


Going forward, Sonos says that it isn't just going to fix its app mistake, but also build a better Sonos experience. To that end, Sonos made seven pledges:

  • Unwavering focus on customer experience with ambitious quality benchmarks, and a promise to not launch products that don't meet the standards customers expect.

  • More stringent pre-launch testing with a broader range of customers to resolve issues before new software comes out.

  • No more all-at-once app releases. Any major changes to the Sonos app will be released gradually, and customers will be able to opt in to test new features before they become the default.

  • There will be a new Quality Ombudsperson role that will give employees a clear path to raise concerns regarding quality and customer experience.

  • Home speaker products currently under warranty will have their warranty extended for an additional year.

  • App updates will come every two to four weeks to "optimize and enhance" the app experience, and they will continue even after the current issues are fixed.

  • Sonos is establishing a Customer Advisory Board to provide feedback and insights from a customer perspective to shape and improve products before they launch.


Sonos says that its Executive Leadership Team will not accept any bonus payout for the October 2024 to September 2025 fiscal year unless Sonos is able to improve the quality of the app and rebuild customer trust.

According to Sonos, more than 80 percent of the missing features from the app have now been reintroduced, and the company expects to be at close to 100 percent in the coming weeks.

Recent reports have suggested that Sonos employees raised alarms before the redesigned Sonos app launched in May. The app was an immediate disappointment to customers because it was riddled with bugs and missing many key Sonos features, and there was significant outcry over the downgrade. Sonos was not able to roll back the changes, and has spent 2024 trying to fix the app.

Sonos has delayed new product launches to focus on software, and as a result, will miss its annual revenue target by $200 million.
Tag: Sonos

This article, "Sonos Promises to Recommit to Software Quality and Customer Experience After App Disaster" first appeared on MacRumors.com

Discuss this article in our forums


Halide Maker Does Deep Dive Into iPhone 16 Pro Camera

By: Juli Clover
October 2, 2024 at 01:24
Each year, the developers behind well-known iPhone camera app Halide take an in-depth look at the new camera technology that Apple has introduced. This year, Sebastiaan de With took more than 1,000 photos with the iPhone 16 Pro to examine changes to the camera setup, Apple's image processing, and more.


Apple added an upgraded 48-megapixel Ultra Wide camera to the ‌iPhone 16 Pro‌ models this year. De With found it to take photos that have "impressive sharpness," but Apple did not add a larger sensor, so you're still not going to get the level of detail that you get with the Wide camera, which has a much bigger sensor.

For macro photos, the 48-megapixel lens "does wonders" for up-close shots. In prior iPhones, the Ultra Wide was cropping in from a 12-megapixel photo, which meant you ended up with an image that was approximately three megapixels. With the 48-megapixel lens, cropping in provides a true 12-megapixel image with more detail.
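
To make the arithmetic concrete, here is a small Swift sketch (the function name and values are ours) showing why cropping in 2x on a 12-megapixel frame leaves roughly 3 megapixels, while the same crop on a 48-megapixel frame still leaves 12.

    // Megapixels left after cropping in by a given factor from the center of a frame.
    // A 2x crop halves both width and height, keeping a quarter of the original pixels.
    func croppedMegapixels(fullMegapixels: Double, cropFactor: Double) -> Double {
        fullMegapixels / (cropFactor * cropFactor)
    }

    print(croppedMegapixels(fullMegapixels: 12, cropFactor: 2)) // 3.0 (older 12MP Ultra Wide)
    print(croppedMegapixels(fullMegapixels: 48, cropFactor: 2)) // 12.0 (iPhone 16 Pro Ultra Wide)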

As for the Main camera, which Apple now calls the "Fusion" camera, it is using a sensor that is the same physical size as the iPhone 15 Pro sensor. While both the iPhone 16 and ‌iPhone 16 Pro‌ have a "Fusion" camera, the ‌iPhone 16 Pro‌ has a larger and higher quality sensor. As with the ‌iPhone 15‌ Pro, the ‌iPhone 16 Pro‌ combines pixels and can produce better 24-megapixel and 12-megapixel images using that data, in addition to full 48-megapixel photos. Processing is about the same as last year, and there is little difference between images captured with the ‌iPhone 15‌ Pro and the ‌iPhone 16 Pro‌ with the standard Wide camera.

There are some notable internal changes that speed up image capture. The Apple Camera Interface provides faster sensor readout times for improvements to QuickTake. QuickTake also supports 4K Dolby Vision HDR, which is a noticeable improvement, according to de With. Capturing 48-megapixel ProRAW images is also a lot faster with quicker shutter speeds, and there's little delay.

For those interested in more info on everything new with the ‌iPhone 16 Pro‌ related to photography, including Camera Control, the Telephoto lens, Night mode, and more, the full Halide review is well worth a read.
Tag: Halide

This article, "Halide Maker Does Deep Dive Into iPhone 16 Pro Camera" first appeared on MacRumors.com

Discuss this article in our forums


Meta is Probably Training AI on Images Taken by Meta Ray-Bans

By: Juli Clover
October 2, 2024 at 00:59
Facebook parent company Meta last week added new AI features to its camera-equipped Ray-Ban Meta Glasses. You can use the camera on the glasses to get information about what's around you and to remember things like where you parked. The glasses also now support video-based AI for "continuous real-time help."


With all of these new features that involve the camera continually viewing what's around the wearer, there are new questions about what Meta is doing with that data. TechCrunch specifically asked Meta if it was using the images collected by the Meta Glasses to train AI models, and Meta declined to say.

"We're not publicly discussing that," Anuj Kumar told TechCrunch. Kumar is a senior director that works on AI wearables. "That's not something we typically share externally," another spokesperson said. When asked for clarification on whether images are being used to train AI, the spokesperson said "we're not saying either way."

TechCrunch doesn't come out and say it, but if the answer is not a clear and definitive "no," it's likely that Meta does indeed plan to use images captured by the Meta Glasses to train Meta AI. If that wasn't the case, it doesn't seem like there would be a reason for Meta to be ambiguous about answering, especially with all of the public commentary on the methods and data that companies use for training.

Meta does train its AI on publicly posted Instagram and Facebook images and stories, which it considers publicly available data. But data collected from the Meta Ray-Ban Glasses that's specifically for interacting with AI in private isn't the same as a publicly posted Instagram image, and it's concerning.

As TechCrunch notes, the new AI features for the Meta Glasses are going to capture a lot of passive images to feed to AI to answer questions about the wearer's surroundings. Asking the Meta Glasses for help picking an outfit, for example, will capture dozens of images of the inside of the wearer's home and upload them to the cloud.

The Meta Glasses have always been used for images and video, but in an active way. You generally know when you're capturing a photo or video because it's for the express purpose of uploading to social media or saving a memory, as with any camera. With AI, though, you aren't the one keeping those images; they're collected for the express purpose of interacting with the AI assistant.

Meta is pointedly declining to confirm what happens to images from the Meta Glasses that are uploaded to its cloud servers for AI use, and that's something Meta Glasses owners should be aware of. Using these new AI features could result in Meta collecting hundreds of private photos that wearers had no intention or awareness of sharing.

If Meta is in fact not using the Meta Glasses this way, it should explicitly state that so customers can be aware of exactly what's being shared with Meta and what that is being used for.
Tags: Facebook, Meta

This article, "Meta is Probably Training AI on Images Taken by Meta Ray-Bans" first appeared on MacRumors.com

Discuss this article in our forums


Apple Releases New AirPods Pro 2 Beta Firmware

By: Juli Clover
October 2, 2024 at 00:55
Apple today released a new beta firmware update for the AirPods Pro 2, with the software available for both the Lightning and USB-C versions of the ‌AirPods Pro‌. The firmware has a build number of 7B5013c, and as it is a beta, it is only available for developers at the current time.


It is not clear what's included in the firmware update at this time, but Apple is planning to introduce hearing aid and hearing test functionality for the ‌AirPods Pro‌ 2 sometime this year.

While the current firmware is limited to developers right now, it will be released for all ‌AirPods Pro‌ 2 users in the future.
Related Roundup: AirPods Pro
Buyer's Guide: AirPods Pro (Neutral)
Related Forum: AirPods

This article, "Apple Releases New AirPods Pro 2 Beta Firmware" first appeared on MacRumors.com

Discuss this article in our forums


Adobe Debuts Photoshop and Premiere Elements 2025

By: Juli Clover
October 2, 2024 at 00:26
Adobe today unveiled new versions of Photoshop Elements and Premiere Elements, the company's affordable photo and video editing software aimed at users who want to enhance their photos and videos with simple editing tools.


Photoshop Elements 2025 incorporates multiple new AI-powered tools to make editing quicker than ever. There's a new Remove tool that lets users brush over an area of an image to remove it, and it is accompanied by an Object Removal Guided Edit.

There's also an option to change the color of objects in an image by selecting them with the automatic selection tools and then choosing a new color.


A new Depth Blur filter adds blur to images to mimic a depth of field effect, with controls for adjusting blur strength, focal distance, and focal range. It's a useful way to change the focus of an image.

There is a Combine Photos Guided Edit that walks users through blending the subject from one image and the background from another to create an all-new image. It can use parts of multiple photos for unique looks.

Other new features include options for textured photo backgrounds and graphics and a one-click option for adding effects like camera motion, animated sparkles, or an animated frame.

Premiere Elements 2025 has several new features for videos. There's a new White Balance tool for adjusting the look of clouds, snow, and similar white elements, plus there are new color correction Curves for making more precise color and brightness adjustments.


There are additional templates for dynamic titles with more control over text alignment, size, color, and spacing. There are preset LUTs for tweaking color, and a simplified Timeline speeds up editing.

For both Photoshop and Premiere Elements, there are enhancements for Macs with the M3 chip, so the software will run faster on those machines.

Photoshop and Premiere Elements are priced at $100 each individually, with a bundle available for $150. More information is available on Adobe's website.
Tag: Adobe

This article, "Adobe Debuts Photoshop and Premiere Elements 2025" first appeared on MacRumors.com

Discuss this article in our forums


Microsoft Discontinuing Mixed Reality HoloLens 2 Headset

By: Juli Clover
October 2, 2024 at 00:07
Microsoft is planning to discontinue its mixed reality HoloLens 2 headsets, according to a report from UploadVR. Production on the HoloLens 2 is ending, and sales will cease when stock runs out.


Security updates will be provided until December 31, 2027, but after that point, Microsoft plans to end software support for the HoloLens 2.

Microsoft was one of the first companies to delve into mixed reality technology, introducing the original HoloLens in 2016 and following up with the HoloLens 2 in 2019. The HoloLens headsets have always been expensive, and Microsoft has targeted them at enterprise customers rather than the general public.

At the current time, Microsoft does not appear to have plans for another HoloLens headset. There were rumors of a version three back in 2022, but work was reportedly canceled due to a lack of focus and internal hardware development challenges. Microsoft has also been downsizing its mixed reality team in 2023 and 2024.

Microsoft does apparently plan to continue supporting its HoloLens IVAS, which stands for integrated visual augmentation system. It is an AR headset that Microsoft is creating for the U.S. Army, and it is set to be tested in early 2025 to determine its feasibility for full-scale production.

As Microsoft has been winding down its work on the HoloLens, it has partnered with Meta to bring Xbox Cloud Gaming and its Office apps to the Quest headsets, and it is also working on Windows 11 integration with the Quest.

Unlike the HoloLens, Apple's Vision Pro headset has been marketed to both consumers and enterprise customers, but it currently shares some of the same shortcomings, such as a high price tag. Apple is not yet ready to abandon mixed reality, and there is another version of the Vision Pro in the works. A second AR/VR headset could come as soon as 2025.
This article, "Microsoft Discontinuing Mixed Reality HoloLens 2 Headset" first appeared on MacRumors.com

Discuss this article in our forums


Juno YouTube App for Vision Pro Removed From App Store

By: Juli Clover
October 1, 2024 at 23:38
Juno, an app designed for watching YouTube on the Vision Pro, has been removed from the App Store, developer Christian Selig said today. Back in April, YouTube emailed Selig and said that Juno was violating the YouTube Terms of Service and the YouTube API by modifying the native YouTube.com web user interface and by using YouTube trademarks and iconography that could be confusing to customers.


In response, Selig switched from using the embed player to the website player, made it clear that Juno was an unofficial YouTube viewer, and explained to YouTube that as a web viewer, Juno is not using YouTube APIs. At the same time, though, YouTube filed a complaint with the ‌App Store‌, and Selig went on to warn customers that he would not fight Google on any decision regarding Juno.

Juno has now been removed from the ‌App Store‌ by Apple in response to YouTube's complaint. Selig says that he does not agree with the decision because Juno is a simple web view that modifies CSS to make the player look more "visionOS like," but he does not plan to appeal.


Selig, for those unaware, was the developer of the Reddit app Apollo, and he faced a public fight with Reddit over its third-party API changes and fees last year. The dispute ultimately ended up with Apollo shutting down. According to Selig, Juno was just a fun hobby project.
Juno was a fun hobby project for me to build. As a developer I wanted to get some experience building for the Vision Pro, and as a user I wanted a nice way to watch YouTube on this cool new device. As a result, I really enjoyed building Juno, but it was always something I saw as fundamentally a little app I built for fun.

YouTube does not have a dedicated app for the Vision Pro, which is why Selig designed and released Juno last February. Before the Vision Pro launched, YouTube said that it would not develop a Vision Pro app, nor would it allow the YouTube iPad app to run on the headset. With Juno removed, those who want to watch YouTube on the Vision Pro will need to use Safari.

Juno for YouTube was priced at $4.99, and Selig says that customers who purchased the app should still be able to use it even though it's been removed from the ‌App Store‌.
Related Roundup: visionOS 2
Tag: YouTube
Related Forum: Apple Vision Pro

This article, "Juno YouTube App for Vision Pro Removed From App Store" first appeared on MacRumors.com

Discuss this article in our forums


Home Depot Launches Smart Glass Doors That Turn From Clear to Opaque

By: Juli Clover
October 1, 2024 at 23:15
Home Depot today launched new Smart Glass Door options that can transform from clear to opaque glass via a smartphone app, providing versatility for customers who want to block light or keep people from seeing inside a home at certain times of day.


The Smart Glass Doors are made by Feather River Doors and are powered through Hubspace, a smart home app that integrates with several products sold by Home Depot, including light bulbs, ceiling fans, blinds, outlets, and locks.

The Smart Glass Doors can be controlled with an app that's available on the iPhone, but there is no HomeKit integration or support for Siri, which is a major downside for people who have a ‌HomeKit‌ setup. There is, however, support for Alexa and Google voice control.

When activated, the glass in the door changes from clear to a more opaque privacy setting. The glass is laminated for energy efficiency, and the door itself is made from fiberglass filled with high density polyurethane foam. It is unpainted, so it can be painted to match a home.

There are several models with different amounts of glass, with pricing that starts at $800. The doors will be available in select Home Depot stores starting on October 28.
This article, "Home Depot Launches Smart Glass Doors That Turn From Clear to Opaque" first appeared on MacRumors.com

Discuss this article in our forums


Apple Celebrating 'Mindful Month' With October 10 Apple Watch Activity Challenge

By: Juli Clover
October 1, 2024 at 23:02
Apple will hold a new "Mindful Month" Apple Watch Activity Challenge that is set to take place on October 10. It is meant to bring awareness to caring for mental health.


Let's bring awareness to all the ways we can take care of our mental health. On October 10, record 10 mindful minutes with any app that adds to Health to get this award.

Like all Activity Challenges, the Mindful Month event will come with animated stickers that can be used in the Messages app.

Mindful Month appears to be a new addition to the Apple Watch Activity Challenge lineup. Apple's last Activity Challenge took place in August for National Park Day.
This article, "Apple Celebrating 'Mindful Month' With October 10 Apple Watch Activity Challenge" first appeared on MacRumors.com

Discuss this article in our forums

Yesterday — October 1, 2024 — MacRumors

Apple Seeds Third visionOS 2.1 Update to Developers

By: Juli Clover
October 1, 2024 at 19:11
Apple today seeded the third beta of an upcoming visionOS 2.1 update to developers for testing purposes, with the new software coming a week after Apple seeded the second visionOS 2.1 beta.


Registered developers are able to opt into the betas by opening up the Settings app on their device, going to the Software Update section, tapping on the "Beta Updates" option, and toggling on the Developer Beta. Note that an Apple ID associated with a developer account is required to download and install the beta.

There's no word yet on what's included in visionOS 2.1, but there are visionOS 2 features that Apple has not yet released, such as the option to use a larger ultrawide screen for the Mac Virtual Display and support for Multiview for MLS and MLB games.
Related Roundup: visionOS 2
Related Forum: Apple Vision Pro

This article, "Apple Seeds Third visionOS 2.1 Update to Developers" first appeared on MacRumors.com

Discuss this article in our forums


Apple Seeds Third Beta of watchOS 11.1 to Developers

By: Juli Clover
October 1, 2024 at 19:11
Apple today seeded the third beta of an upcoming watchOS 11.1 update to developers for testing purposes. The third beta comes a week after Apple released the second watchOS 11.1 beta.


To install the ‌‌watchOS 11‌.1 update, developers need to open the Apple Watch app, go to the Software Update section under "General" in Settings, and toggle on the ‌‌watchOS 11‌‌ Developer Beta. An Apple ID linked to a developer account is required.

Once beta updates have been activated, ‌‌watchOS 11‌‌.1 can be downloaded under the same Software Update section. To install the software, an Apple Watch needs to have at least 50 percent battery and must be placed on an Apple Watch charger.

It's not yet clear what new features are included in the watchOS 11.1 update, as this round of updates primarily focuses on Apple Intelligence, and Apple Intelligence features are not available on watchOS.
Related Roundup: watchOS 11
Related Forum: Apple Watch

This article, "Apple Seeds Third Beta of watchOS 11.1 to Developers" first appeared on MacRumors.com

Discuss this article in our forums


Apple Seeds Third tvOS 18.1 Beta to Developers

By: Juli Clover
October 1, 2024 at 19:10
Apple today seeded the third beta of an upcoming tvOS 18.1 update to developers for testing purposes, with the new software coming a week after Apple seeded the second tvOS 18.1 beta.


Registered developers are able to download the tvOS 18.1 update by opting in to the beta through the Settings app on the Apple TV. A registered developer account is required.

tvOS software releases are usually minor in scale compared to other operating system updates, focusing primarily on smaller improvements rather than outward-facing changes. We don't know what's included in tvOS 18.1.

Apple shares some information on tvOS releases in its tvOS support document, which is updated after each tvOS launch, but Apple does not provide notes during beta testing.

Though we don't always know what's new in tvOS betas, we let MacRumors readers know when new updates are available so those who are developers can download new software upon release.
Related Roundup: Apple TV
Buyer's Guide: Apple TV (Don't Buy)

This article, "Apple Seeds Third tvOS 18.1 Beta to Developers" first appeared on MacRumors.com

Discuss this article in our forums


Halide App Gains Added iPhone 16 Camera Control Functionality

By: Juli Clover
October 1, 2024 at 18:39
Popular camera app Halide was today updated with new features for the Camera Control button available on the new iPhone 16 models. Halide already supported opening the app with Camera Control, but now users can also make adjustments.


Using the Camera Control's touch and swipe-based gesture support, Halide users can adjust focus and exposure, and lock their settings in place. The dedicated "Locked" adjustment makes sure that no settings can be disturbed by accidental swipes on the Camera Control button, and it is designed for people who do not want to use manual adjustments.

The Exposure setting allows users to set an exposure bias that exceeds what's possible with Apple's Camera app (±6 EV vs. ±2 EV), and Focus allows for manual focusing on a subject or scene using the Camera Control button. Halide says that Focus provides a smooth manual focus experience with the iPhone.
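
For context, here is a minimal sketch (not Halide's actual code) of how a third-party camera app can apply an exposure bias through AVFoundation, clamped to whatever range the hardware reports; the function name and clamping logic are our own.

    import AVFoundation

    // Apply an exposure bias (in EV) to a capture device, clamped to the range the
    // hardware reports. Third-party apps can expose this full range, which is
    // typically wider than the ±2 EV available in Apple's Camera app.
    func setExposureBias(_ bias: Float, on device: AVCaptureDevice) throws {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }

        let clamped = min(max(bias, device.minExposureTargetBias),
                          device.maxExposureTargetBias)
        device.setExposureTargetBias(clamped) { _ in
            // Called with the timestamp at which the new bias takes effect.
        }
    }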

In addition to these changes, Halide has added a quicker way to capture a photo. Halide takes the photo when the Camera Control button is pressed, while Apple's Camera app takes the image when the button is released. The difference in timing is small, but fractions of a second can sometimes matter.
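
As a rough illustration of the press-versus-release distinction, here is a sketch using AVKit's AVCaptureEventInteraction, which delivers hardware capture-button phases to camera apps; the phase handling and the takePhoto() hook are our assumptions, not Halide's implementation.

    import AVKit
    import UIKit

    final class CaptureViewController: UIViewController {
        override func viewDidLoad() {
            super.viewDidLoad()

            // Hardware capture events arrive with .began (press) and .ended (release) phases.
            let interaction = AVCaptureEventInteraction { [weak self] event in
                switch event.phase {
                case .began:
                    // Capture immediately on press, rather than waiting for release.
                    self?.takePhoto()
                case .ended, .cancelled:
                    break
                @unknown default:
                    break
                }
            }
            view.addInteraction(interaction)
        }

        private func takePhoto() {
            // Hypothetical hook into the app's AVCapturePhotoOutput pipeline.
        }
    }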

Halide says that the quick snap feature works well with the 48-megapixel ProRAW Zero Shutter Lag feature in the app, as well as with Process Zero, an option that Halide added to take 12-megapixel images with no AI and minimal processing.

Halide can be downloaded from the App Store. The app is priced at $2.99 per month, $19.99 per year, or $59.99 for a one-time purchase. [Direct Link]
Tag: Halide

This article, "Halide App Gains Added iPhone 16 Camera Control Functionality" first appeared on MacRumors.com

Discuss this article in our forums


Customizing Your iPhone 16 Images With Photographic Styles

By: Juli Clover
October 1, 2024 at 00:53
One of the major new software-based camera features in the iPhone 16 models is support for a wider range of Photographic Styles, complete with more granular controls that let you create a customized look for all of your images.


Photographic Styles aren't quite as easy to use as a simple filter, so we thought we'd delve into how the feature works and how to get the most from it.

Available Photographic Styles


Photographic Styles aren't new, but with prior iPhone models, there were only four options: Rich Contrast, Vibrant, Warm, and Cool. On the ‌iPhone 16‌, there are several more pre-set styles to choose from.

  • Cool Rose

  • Neutral

  • Rose Gold

  • Gold

  • Amber

  • Standard (No edits)

  • Vibrant

  • Natural

  • Luminous

  • Dramatic

  • Quiet

  • Cozy

  • Ethereal

  • Muted Black and White

  • Stark Black and White


All of the styles have varying Tone, Color, and Palette settings that correspond to brightness, saturation, and effect intensity.

How Photographic Styles Work


According to Apple, on the ‌iPhone 16‌ models, Photographic Styles adjust specific colors in select parts of your photos to change the overall look.

The first five Photographic Styles are tuned for skin undertones: Cool Rose, Neutral, Amber, Rose Gold, and Gold. Cool Rose accentuates cool pinkish undertones, while Neutral neutralizes warm undertones. Amber, Rose Gold, and Gold accentuate those specific tones. These can be fairly subtle, depending on the settings you choose.

Other Styles are closer to what you get with a filter, adding more dramatic effects that impact the mood of the image.

Setting Up and Customizing Your Style


When you've taken at least four photos with the ‌iPhone 16‌ camera, you can go to Settings > Camera > Photographic Styles to set the base tone that you want to use for all of your images.

You can pick from the skin tone-focused options, which include Cool Rose, Neutral, Amber, Rose Gold, and Gold. You'll see the different effects across four images, and you can adjust the intensity to begin with.

After you've selected a favorite undertone, you can further refine the look by dragging a finger over the touchpad, which changes the brightness and saturation. The slider changes the overall intensity.

Once you've set your Photographic Style, it's automatically applied to all of your images and it is the default value for your photos.

If you want to turn it off, you can go to Settings > Camera > Photographic Styles > Reset to Standard.

Real-Time Previews


In the Camera app, when you go to take a photo, you can tap on the touchpad icon to select a different undertone or mood style. Swiping changes the Photographic Style, and the controls below can be used to customize the look.

The option to use customized Photographic Styles in real-time lets you preview what an image will look like with different effects before you even take it.

You can access Photographic Styles from the Camera Control button too. Press Camera Control once to open the Camera app, then light press to bring up the Camera Control menu. Swipe until you get to Styles or Tone, then light press again to select it. From there, you can make adjustments by swiping. To get back to the menu to select another option, use a light double press.

Editing After a Shot


You can add or adjust a Photographic Style after an image has been captured, so it's not something that you need to do in the moment. This is the first time that Apple has allowed Photographic Styles to be edited after the fact; earlier versions of the feature only allowed a Style to be applied when taking an image.

To edit a Photographic Style, go to the Photos app, tap on the three bars to enter the editing interface, and then tap on Styles. You can select any of the styles and then adjust it using the touchpad.

The touchpad's X axis adjusts color, while the Y axis adjusts tone. The slider adjusts overall intensity or Palette. A Tone setting of 0, a Color setting of 0, and a Palette setting of 0 result in a "Standard" photo that has no Photographic Style applied, so that's a good starting point to better understand exactly what each style is changing.

Adjusting the Color to the left desaturates, while dragging it to the right deepens color. Dragging Tone up makes the image brighter, while dragging it down makes it darker.
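
As a mental model only (this is not Apple's API), you can think of the touchpad and slider as producing three numbers that together define the style; the hypothetical Swift sketch below just makes that mapping explicit.

    // Hypothetical model of a Photographic Style adjustment; the names and ranges
    // are illustrative only and are not Apple's API.
    struct StyleAdjustment {
        var tone: Double     // Y axis of the touchpad: down is darker, up is brighter
        var color: Double    // X axis of the touchpad: left desaturates, right deepens color
        var palette: Double  // slider: overall intensity of the style

        // Tone 0, Color 0, Palette 0 corresponds to a "Standard" photo with no style applied.
        var isStandard: Bool { tone == 0 && color == 0 && palette == 0 }
    }

    let subtleWarmth = StyleAdjustment(tone: 10, color: -5, palette: 25)
    print(subtleWarmth.isStandard) // false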

You can change the Photographic Style setting at any point, and it is a non-destructive edit so it's not permanent. If you want to get back to a normal, unedited photo, choose the Standard setting.

Photographic Styles are entirely distinct from the ‌Photos‌ app Adjust section where you can tweak exposure, brilliance, highlights and shadows, contrast, brightness, saturation, vibrance, warmth, tint, and more.

HEIF Only


If you have your ‌iPhone‌ set to take JPG images instead of HEIF, you won't be able to use Photographic Styles. You need to have HEIF turned on. In the Camera section of the Settings app, HEIF can be enabled by going to Formats and choosing "High Efficiency" instead of "Most Compatible."
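
For developers curious what the "High Efficiency" setting maps to under the hood, here is a minimal AVFoundation sketch that requests HEIF (HEVC-compressed) stills when the device supports them; the helper name and JPEG fallback are our own illustration, and session setup is omitted.

    import AVFoundation

    // Build photo settings that prefer HEIF (HEVC-compressed stills) and fall back to JPEG.
    func makePhotoSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
        if output.availablePhotoCodecTypes.contains(.hevc) {
            // "High Efficiency" in Settings corresponds to HEIF containers with HEVC compression.
            return AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
        } else {
            // "Most Compatible" corresponds to JPEG.
            return AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
        }
    }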

Preserve Settings


By default, the Photographic Style that you set up using the Settings app will be used automatically. If you select a different Photographic Style when you're taking a photo, it will use that only until you close the Camera app.

If you want the Photographic Style you selected in the Camera app to be the new default, you need to go to Settings > Camera > Preserve Settings and toggle on Photographic Style. With this turned on, your ‌iPhone‌ will preserve your last used Photographic Style rather than reset to your default.

Photographic Styles and Older iPhones


Older iPhones do not support the new touchpad and adjustment settings for Photographic Styles, but if you take an image on an ‌iPhone 16‌ and then edit it on an older phone, the touchpad appears so you can make further adjustments.
Related Roundups: iPhone 16, iPhone 16 Pro

This article, "Customizing Your iPhone 16 Images With Photographic Styles" first appeared on MacRumors.com

Discuss this article in our forums


Apple Releases Safari Technology Preview 204 With Bug Fixes and Performance Improvements

By: Juli Clover
October 1, 2024 at 00:25
Apple today released a new update for Safari Technology Preview, the experimental browser that was first introduced in March 2016. Apple designed ‌Safari Technology Preview‌ to allow users to test features that are planned for future release versions of the Safari browser.


‌Safari Technology Preview‌ 204 includes fixes and updates for Accessibility, CSS, Forms, JavaScript, Service Workers, Web API, Web Driver, Web Extensions, and Web Inspector.

The current ‌Safari Technology Preview‌ release is compatible with machines running macOS Sonoma and macOS Sequoia, the newest version of the Mac operating system.

The ‌Safari Technology Preview‌ update is available through the Software Update mechanism in System Preferences or System Settings to anyone who has downloaded the browser from Apple's website. Complete release notes for the update are available on the Safari Technology Preview website.

Apple's aim with ‌Safari Technology Preview‌ is to gather feedback from developers and users on its browser development process. ‌Safari Technology Preview‌ can run side-by-side with the existing Safari browser and while it is designed for developers, it does not require a developer account to download and use.
This article, "Apple Releases Safari Technology Preview 204 With Bug Fixes and Performance Improvements" first appeared on MacRumors.com

Discuss this article in our forums


Epic Games Accuses Samsung and Google of Colluding to Prevent Sideloading on Galaxy Devices

By: Juli Clover
September 30, 2024 at 23:32
Fortnite creator Epic Games today filed a lawsuit against Google and Samsung, accusing the two companies of anticompetitive behavior that discourages Android users from downloading games and apps outside of the Google Play Store.


At issue is a Samsung feature called Auto Blocker, which is designed to prevent Galaxy devices from installing applications from unauthorized sources. Samsung says that Auto Blocker, which is enabled by default, keeps users safe from unknown apps and malware, but it also disables sideloading.

With Auto Blocker, when users attempt to install an app from an unknown or unauthorized source, they'll receive a pop-up alert letting them know that installation was prevented. Auto Blocker can be overridden during the setup of a Galaxy device, and there is also an option to temporarily remove it. Auto Blocker was first introduced last October, and ‌Epic Games‌ says that the feature is in fact a "coordinated effort" to block competition in app distribution.

"Auto Blocker cements the Google Play Store as the only viable way to get apps on Samsung devices, blocking every other store from competing on a level playing field," says ‌Epic Games‌.

‌Epic Games‌ is accusing Samsung and Google of creating Auto Blocker with the purpose of undermining the result of the Epic Games v. Google lawsuit, where a nine-member jury agreed that Google had an app store monopoly and that Google's agreements with OEMs were anticompetitive.

When speaking to journalists earlier today (via The Verge), ‌Epic Games‌ CEO Tim Sweeney said that he did not have proof that Google and Samsung had colluded on the Auto Blocker feature, but he is hoping to find evidence during the document discovery process. He also did not ask Samsung to make ‌Epic Games‌ an authorized source for games.

Further, Sweeney claimed that Epic was filing the lawsuit on behalf of all developers, and not just to get ‌Epic Games‌ special treatment. "If we'd fought Epic v. Apple and Epic v. Google solely on the basis of Epic getting special privileges, perhaps settlement discussions with Apple and Google might have been fruitful," said Sweeney. "But if we did that, we'd be selling out all developers."

Evidence from the Epic v. Apple trial suggests that Sweeney did, at that time, seek a special deal with Apple that would not have been available to all developers. In 2020, Sweeney wrote a letter to Apple asking for permission to add support for third-party payment processors in Fortnite, and it's only when Apple said no that Epic filed a lawsuit against Apple. When questioned about this letter at trial, Sweeney confirmed that he was seeking a special deal for Fortnite and would have accepted it even if Apple didn't offer the same deal to other developers.

Sweeney has suggested multiple times that the lawsuits Epic is filing against Apple, Google, and Samsung are for all developers, but realistically, ‌Epic Games‌ is looking out for its own interests.

‌Epic Games‌ is aiming to have the court force Samsung to remove Auto Blocker as a default device setting. Samsung, in a statement to The Verge, said that Auto Blocker is a security and privacy feature that users can disable at any time. "We plan to vigorously contest Epic Games' baseless claims," said Samsung. Google called the lawsuit "meritless."
This article, "Epic Games Accuses Samsung and Google of Colluding to Prevent Sideloading on Galaxy Devices" first appeared on MacRumors.com

Discuss this article in our forums

Before yesterday — MacRumors

AirPods 4 With ANC vs. AirPods Pro 2

By: Juli Clover
September 28, 2024 at 16:02
Apple last week released the AirPods 4, and one version of the new earbuds has Active Noise Cancellation included. ANC means the ‌AirPods 4‌ have a feature set that rivals the AirPods Pro 2, so we thought we'd compare the two for those undecided on which to get.


The ‌AirPods 4‌ are Apple's first open-ear earbuds to include ANC, and the major difference between the ‌AirPods Pro‌ 2 and the ‌AirPods 4‌ is the silicone tips. The ‌AirPods Pro‌ 2 have silicone tips for a tighter seal in the ear, while the ‌AirPods 4‌ don't.

You're not going to get the same level of ANC with the ‌AirPods 4‌ that you get with the ‌AirPods Pro‌ 2 because there's no sealing mechanism to block out noise, but the ‌AirPods 4‌ still perform impressively well. The new earbuds are able to cut down on plane and road noise even without the tight ear seal.

Sound quality is about the same because the ‌AirPods 4‌ now have the same H2 chip as the ‌AirPods Pro‌ 2. The ‌AirPods 4‌ have a much smaller case than the ‌AirPods Pro‌ 2, but both cases have a speaker that can play a sound when you need to track them down using Find My. MagSafe wireless charging is exclusive to the ‌AirPods Pro‌ 2, though. The ‌AirPods 4‌ with ANC can wirelessly charge, but not with the strong magnetic connection available with the ‌AirPods Pro‌.

There are a few other differences that are worth knowing about, which we go over in the video above.

Choosing between the ‌AirPods Pro‌ 2 and the ‌AirPods 4‌ with ANC mostly comes down to fit. For some people, the silicone tips of the ‌AirPods Pro‌ 2 are more comfortable, while others find an open-ear design to be better for long wear. Price is also a factor, as the ‌AirPods 4‌ with ANC are $179 and the ‌AirPods Pro‌ 2 are $249. Of course, if you don't need ANC at all, you can get the ‌AirPods 4‌ without ANC for $129.

Let us know which earbuds you prefer in the comments below.
Related Roundups: AirPods 4, AirPods Pro
Related Forum: AirPods

This article, "AirPods 4 With ANC vs. AirPods Pro 2" first appeared on MacRumors.com

Discuss this article in our forums


Best Ways to Use the iPhone 16 Action Button

By: Juli Clover
September 27, 2024 at 23:03
With the iPhone 16 lineup, Apple brought the Action Button to all four devices, expanding it beyond last year's Pro-only limitation. At the same time, there's a new Camera Control button that eliminates the need to activate the camera with the Action Button, which was one of its most useful functions. There are also new Control Center options that can be assigned to the Action Button, expanding what's possible.


This guide goes over what you can do with the Action Button with an ‌iPhone 16‌ and iOS 18, and it may be useful to help you find something new to use it for.

Base Functions


The Action Button has been around since last year, so Apple already offers several base functions that can be assigned to it.


  • Silent Mode - This toggles Silent Mode on and off, and it's the one-to-one replacement for the prior mute switch. This is useful if you often switch sound on and off, but if you keep your phone silent all the time, it's not that useful.

  • Focus - You can set the Action Button to toggle on any Focus mode that you've set up. This is a good option if you've got a Do Not Disturb type of Focus that you like to turn off and on throughout the day. Of course, Focus modes can also be set to turn on and off at specific times instead, so this can be automated in other ways.

  • Camera - If you have an ‌iPhone 16‌ with the Camera Control button, there's no need to set the Action Button to open the Camera, unless you want it to do something like open the selfie camera while the Camera Control button opens the rear camera. It's easy to swap camera modes from the Camera Control button or the Camera app once it's open, though.

  • Flashlight - Flashlight is potentially one of the more useful Action Button settings if you regularly use the Flashlight function in the dark. Prior to ‌iOS 18‌, the Flashlight was a mandatory Lock Screen button, but that's no longer the case. You can set other functions to the Lock Screen now, so it can make more sense to move the Flashlight to the Action Button depending on how often you use it.

  • Voice Memos - Setting the Action Button to Voice Memos starts a recording when you press the button the first time, and stops it with a second press. In ‌iOS 18‌, you can get transcriptions of Voice Memos, which adds a lot more functionality. If you want to regularly record class lectures, interviews, or meetings, setting Voice Memos to the Action Button might be your best bet.

  • Recognize Music - This basically activates Shazam to tell you what music is playing around you. Unless you're using Shazam all the time, this is probably better to activate from Control Center.

  • Translate - When you press the Action Button with Translate set, it'll automatically listen to what's being said and then provide a translation. You'll need to select your languages in the Translate app, but after that, it doesn't open up a full app. It's a quick access interface where you can get a translation and even have that translation spoken aloud. There are some limitations on languages, but if you're traveling to a country where one of the available languages is spoken, this one's a super useful way to take advantage of the Action Button.

  • Magnifier - Magnifier opens up the Camera app and lets you set a zoom level so you can magnify small text. You can change brightness and contrast for better viewing, and turn on the flash if it's dark. If you have eyesight issues and trouble with text that's too small, Magnifier has the potential to be helpful, and you'll probably need it often enough to justify assigning it to the Action Button.

  • Accessibility - You can set the Action Button to any Accessibility feature. There is a long list of options, but some of the more useful ones to use with a quick access toggle include Zoom, VoiceOver, Voice Control, Apple Watch Mirroring, Background Sounds, Conversation Boost, Live Speech, and Guided Access.

  • No Action - Don't want to use the Action Button? Setting it to No Action means it won't do anything when it's pressed.


Shortcuts


You can set any Shortcut to be activated with the Action Button using the "Shortcuts" setting, and that's how some people get the most out of the feature. You can create a Shortcut that brings up several different apps and functions.

The "Super Action Button" shortcut, for example, opens up a menu where you can choose from options like taking a screenshot, turning on the flashlight, creating a Reminder, starting a Voice Memo, opening Apple Maps, creating a Calendar event, scanning a document, and more.

You can find a bunch of these online on Reddit or the MacRumors forums, or you can create your own with the functions that you want to access quickly.

Third-party apps that have Shortcuts created by developers will also show up in the Action Button Shortcuts section, so if you want to have the Action Button do something like open a book in Audible or create a to-do in Things, you would set that up with the Shortcuts app. Here are a few first and third-party app Shortcuts that might be useful:

  • ChatGPT - Ask ChatGPT a question that you type in, or start a voice conversation.

  • Audible - Read a book or set a sleep timer.

  • Clock - Set a timer.

  • Files - Scan a document.

  • Google - Start a Google search or a voice search.

  • Music - Play music from your Apple Music library or a radio station.

  • Phone - Call or FaceTime someone.

  • Podcasts - Play a podcast.

  • Things - Add a to-do.

  • Fantastical - Create an event.

  • Remote - Activate the Remote Control feature for Apple TV.

  • Open an app - Set the Action Button to open any app you have installed.


What you have available for the Shortcuts Action Button option will depend on which apps you have installed and which Siri Shortcut features they've implemented. Note that this setting is distinct from the Control Center controls that you can also assign to the Action Button.

To make things more confusing, there are different app actions in the Shortcuts app that aren't available in the Action Button settings unless you've previously set them up. You can, for example, have the Action Button launch an Amazon search, but only if you have set up a Shortcut for that function.

So if there's something that you want to do that you're not seeing from the Shortcuts interface in the Action Button Settings, head over to the Shortcuts app, tap on the "+" button and go through the different app options there. If you find an app feature you want to use, like activating a Hue lighting scene, set that as a Shortcut and then you can assign it to the Action Button.

Along with these simple app options that you already have available, you can download any Shortcut from the Shortcuts Gallery or the internet and add that to the Action Button.
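
For developers wondering how an app's actions end up in the Action Button's Shortcuts list, the usual route is the App Intents framework: any intent the app exposes can be assigned. Below is a minimal, hypothetical sketch; the to-do example and every name in it are ours, not from Things or any other specific app.

    import AppIntents

    // A hypothetical intent that the Action Button (or the Shortcuts app) can run.
    struct AddToDoIntent: AppIntent {
        static var title: LocalizedStringResource = "Add To-Do"

        @Parameter(title: "Title")
        var todoTitle: String

        func perform() async throws -> some IntentResult {
            // In a real app this would save the to-do to the app's own storage.
            print("Adding to-do: \(todoTitle)")
            return .result()
        }
    }

    // Optionally surface the intent as an App Shortcut so it shows up without manual setup.
    struct ToDoShortcuts: AppShortcutsProvider {
        static var appShortcuts: [AppShortcut] {
            AppShortcut(
                intent: AddToDoIntent(),
                phrases: ["Add a to-do in \(.applicationName)"],
                shortTitle: "Add To-Do",
                systemImageName: "checklist"
            )
        }
    }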

Controls


In ‌iOS 18‌, Apple opened up Control Center to third-party app developers and also added more first-party Control Center options. Some of those new Control Center features can be set to the Action Button, and you'll find them under the Controls section when you're deciding on a function for your Action Button.


Some Control Center controls just aren't available for the Action Button. There's no control for turning on Low Power Mode, for example; you can set the Action Button to activate Low Power Mode, but you need to do it with Shortcuts.

You can also do things like turn off Cellular connectivity, but not Wi-Fi, even though both of those are Control Center toggles.

The distinction between Shortcuts and Controls is pretty confusing, especially with arbitrary restrictions like that. Some of the available Control Center toggles:

  • Activate Dark Mode

  • Open the Timer interface

  • Scan a Code

  • Open an app like Instagram or Halide to the camera

  • Toggle on Airplane mode

  • Turn off cellular data

  • Open the Home app

  • Start a Quick Note


Third-party controls are also present, and a lot of them mirror what you can do with their Shortcuts. But some apps might have Control Center controls and not Shortcuts, or there may be differences between what's available. Shortcuts generally have more options available.

Lock Screen Controls


Keep in mind that you can also set different Control Center controls to the Lock Screen now since the Camera and Flashlight options can be swapped out. It might make more sense to set a Control Center action to the Lock Screen so you can free up the Action Button for something else.


Share Your Action Button Setup


What do you use the Action Button for? Let us know in the comments below if you've come up with something clever.
Related Forums: iOS 18, iPadOS 18

This article, "Best Ways to Use the iPhone 16 Action Button" first appeared on MacRumors.com

Discuss this article in our forums


iPhone 16 Pro Max: One Week Camera Review

By: Juli Clover
September 27, 2024 at 20:10
It's been a full week since the new iPhone 16 models launched, and we've now had enough time to give the Camera Control button and the camera setup a more in-depth look. We tested the iPhone 16 Pro and Pro Max, which have triple-lens rear camera setups with a 48-megapixel Fusion camera, a 48-megapixel Ultra Wide camera, and a 5x Telephoto lens.


The Camera Control button has turned out to be great for quickly launching the camera and snapping photos and videos, but the more in-depth adjustments that require light presses and swiping are harder to get used to.

It's still quicker to get to these controls in the Camera app itself by tapping on the display. Maybe that's a matter of getting used to the new setup, but it's also harder to use the button in portrait orientation than landscape mode, and a lot of people take photos in portrait orientation these days. On the ‌iPhone 16 Pro‌ Max especially, the button is just too low to be comfortable to use when holding the device vertically.

New this year is the 48-megapixel Ultra Wide lens, and it's a big improvement over the prior 12-megapixel Ultra Wide lens. Apple is using pixel binning to combine four pixels into one, ultimately providing a 12-megapixel finished image, but merging pixels allows for more detail and improved quality in low light.
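
Conceptually, binning averages (or sums) each 2x2 block of photosites into a single output pixel, which is why a 48-megapixel sensor yields a 12-megapixel photo; the toy Swift sketch below illustrates the idea and is a simplification of what actually happens on the sensor.

    // Toy 2x2 pixel binning: each output pixel is the average of a 2x2 block of inputs.
    // Real sensors bin in hardware and feed the result into Apple's image pipeline; this
    // only illustrates why 48 million photosites end up as a 12-megapixel image.
    func bin2x2(_ pixels: [[Double]]) -> [[Double]] {
        stride(from: 0, to: pixels.count - 1, by: 2).map { y in
            stride(from: 0, to: pixels[y].count - 1, by: 2).map { x in
                (pixels[y][x] + pixels[y][x + 1] + pixels[y + 1][x] + pixels[y + 1][x + 1]) / 4
            }
        }
    }

    let sensor: [[Double]] = [[1, 3, 5, 7],
                              [1, 3, 5, 7],
                              [2, 2, 6, 6],
                              [2, 2, 6, 6]]
    print(bin2x2(sensor)) // [[2.0, 6.0], [2.0, 6.0]]: a quarter of the pixels remain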

The 5x Telephoto lenses haven't changed, but you can now get 5x optical zoom on the 16 Pro in addition to the Pro Max, which is new this year. The "Fusion" camera is also basically identical to the "Main" camera from last year, and you still have settings to choose between 24mm, 28mm, and 35mm focal lengths.

Photographic Styles are a great addition for tweaking the look of an image, and the pad for adjustments is useful to get a unique mood with minimal effort. Make sure to check out our video above to see it in action.

For videographers, the option to shoot in 4K 120 fps is super useful, especially because you can slow it down or speed it up in post production. The Audio Mix feature for changing the audio also really makes a difference for capturing video.

We have some ‌iPhone 16 Pro‌ camera comparisons coming, where we pit the ‌iPhone 16 Pro‌ against some other flagship smartphones. Stay tuned to MacRumors to see those videos next week.
Related Roundup: iPhone 16 Pro

This article, "iPhone 16 Pro Max: One Week Camera Review" first appeared on MacRumors.com

Discuss this article in our forums


MacRumors Giveaway: Win an iPhone 16 Pro From Lululook

By: Juli Clover
September 27, 2024 at 19:18
For this week's giveaway, we've teamed up with Lululook to offer MacRumors readers a chance to win one of Apple's new iPhone 16 Pro models. For those unfamiliar with Lululook, it is a company that makes a wide range of Apple accessories.


For charging an iPhone, Apple Watch, and AirPods, Lululook has the $80 Ultra-Rise Qi2 Charging Station. It can charge a MagSafe-compatible ‌iPhone‌ at up to 15W, while also providing enough power to fast charge an Apple Watch. Both the Apple Watch and AirPods can charge at 5W.

The Charging Station has a compact design with an upright adjustable Qi2 charging platform, which supports an ‌iPhone‌ in portrait or landscape orientation so you can use StandBy. The Apple Watch charger is tucked behind the ‌iPhone‌ charger, and the AirPods charge at the base of the device.


If you're looking for a new ‌iPhone‌ wallet option, Lululook offers a $36 Leather Magnetic Wallet Stand that combines a card wallet with a handy pull-out stand that you can use to watch videos, make calls, and more. It holds three cards, and it comes in black, dark brown, or brown. There's a strong magnet inside, but it's easy to detach for quick card access.


The new iPhone 16 models require a 20W or higher charger for fast charging over USB-C, or 30W for fast charging over MagSafe, and Lululook has the perfect 65W three-port GaN charger option. It includes two USB-C Power Delivery ports and a USB-A port for older accessories. The two USB-C ports split the power at 30W each, ideal for the ‌iPhone‌.


Lululook ships the charger with a Universal Travel Converter so it can be used in the UK, EU, and Australia in addition to the United States, which also makes it ideal for travel. For those interested in any of Lululook's accessories, there is a sitewide September sale right now; just use the code 15OFF at checkout.

We have a new 128GB ‌iPhone 16 Pro‌ for one lucky winner, with the winner able to choose the color. To enter to win, use the widget below and enter an email address. Email addresses will be used solely for contact purposes to reach the winner(s) and send the prize(s). You can earn additional entries by subscribing to our weekly newsletter, subscribing to our YouTube channel, following us on Twitter, following us on Instagram, following us on Threads, or visiting the MacRumors Facebook page.

Due to the complexities of international laws regarding giveaways, only U.S. residents who are 18 years or older, UK residents who are 18 years or older, and Canadian residents who have reached the age of majority in their province or territory are eligible to enter. All federal, state, provincial, and/or local taxes, fees, and surcharges are the sole responsibility of the prize winner. To offer feedback or get more information on the giveaway restrictions, please refer to our Site Feedback section, as that is where discussion of the rules will be redirected.

The contest will run from today (September 27) at 9:00 a.m. Pacific Time through 9:00 a.m. Pacific Time on October 4. The winner will be chosen randomly on or shortly after October 4 and will be contacted by email. The winner will have 48 hours to respond and provide a shipping address before a new winner is chosen.
This article, "MacRumors Giveaway: Win an iPhone 16 Pro From Lululook" first appeared on MacRumors.com

Discuss this article in our forums
