Friday, February 23, 2024
Smartphone news

Google rolls out Night Sight mode to Pixel smartphones – Tech News


Google is now rolling out an update to the Pixel Camera app, bringing a new Night Sight mode to all three generations of Pixel smartphones, for use on both the main and selfie cameras.

This new feature promises clean, sharp photos in low-light conditions, with no need for a flash or tripod. 

This new camera feature from Google dynamically adapts to the environment before the user even takes a picture. The smartphone automatically suggests Night Sight mode when sufficiently dark conditions are detected, and the user simply taps the onscreen suggestion to activate it.

After pressing the shutter button, users are advised to keep the camera as still as possible until the phone has finished taking the photo. If the smartphone stays relatively steady, the new mode increases exposure time to capture as much light as possible and limit noise in the image.

However, if the Pixel handset is moved around a lot, or the scene being photographed is itself moving, exposure time is reduced to capture less light and so limit blurring caused by movement. Google explains that Night Sight actually takes several photos in a rapid burst, then merges them to obtain the sharpest, best-lit image possible. The mode is designed to adapt to all situations, with no need for users to adjust settings manually.
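The burst-and-merge idea Google describes can be illustrated with a minimal sketch: averaging several noisy exposures of the same scene suppresses random sensor noise. This is not Google's actual pipeline (which also aligns frames and handles motion between shots); the function name and numbers below are purely illustrative.

```python
import numpy as np

def merge_burst(frames):
    """Average a burst of already-aligned frames to reduce noise.

    Illustrative only: the real Night Sight pipeline also aligns
    frames and rejects motion, which this sketch omits.
    """
    stack = np.stack([f.astype(np.float64) for f in frames])
    return stack.mean(axis=0)

# Simulate a dark scene photographed 15 times with random sensor noise.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 20.0)  # "true" low-light brightness per pixel
frames = [scene + rng.normal(0.0, 5.0, scene.shape) for _ in range(15)]

merged = merge_burst(frames)
single_err = np.abs(frames[0] - scene).mean()  # noise in one exposure
merged_err = np.abs(merged - scene).mean()     # noise after merging
```

Averaging N independent exposures cuts the noise by roughly a factor of the square root of N, which is why `merged_err` comes out well below `single_err` here.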

Google makes image quality a priority in its smartphones. The latest Pixel 3 has a host of innovative functions, such as Top Shot, a tool that automatically selects the best shot from a short burst of frames – some taken just before the user hits the shutter button. Thanks to machine learning, the camera can pre-select images in which people smile and have their eyes open. A Playground mode also lets users pose for photos with favourite Star Wars or Avengers characters. 

The Google Pixel 3 is out now priced from US$799 (RM3,349). – AFP Relaxnews