Google sells laptops and seems to be getting into the tech game at every turn. Its latest effort is the Google Clips, a $250 smart camera that uses artificial intelligence (AI) to detect and capture the important moments in your life.
If you’re a parent or a pet owner, you’re probably familiar with the frustration of trying to capture great candids of your kids’ or pets’ activities. Or maybe you have a ton of photos and videos of your family, but you’re not in any of them because you’re always the one taking the pictures. Google thinks Clips can solve that frustration by providing a camera that does all of the shooting for you. The only problem is that Clips just doesn’t work very well.
At its core, Clips is a simple automatic point-and-shoot camera that looks similar to a GoPro, but considerably smaller. It has a fixed-focus lens with a 130-degree wide-angle field of view, a single button, and a few LEDs. The camera doesn’t have a display or user interface of its own; it connects to an iPhone, Google Pixel, or Samsung Galaxy S9 over Bluetooth and Wi-Fi Direct, which you use to control the camera and download the images it captures.
Inside is what’s supposed to make Clips special. It runs Google’s people-detection algorithms to recognize familiar faces and interesting activities, and then automatically captures the moments you care about. Clips doesn’t record video or sound; technically, it shoots a burst of still images at roughly 15fps and assembles them into seven-second clips, which you can then edit or pull stills from. It’s essentially making high-resolution GIFs out of sequences of images.
You can use the big button on the front of the camera to force a capture, or use the app on your phone to see what the camera sees and take shots from there. But the whole point of Clips is to let the camera and Google’s algorithms do the heavy lifting, so you can enjoy your time and review the memories Clips captured later on. To facilitate this, Clips comes with a silicone case that makes it easy to prop up almost anywhere, or even clip to things.
However, it is not designed to be a body camera. You’re supposed to set it down and leave it alone, for the most part. There are other accessories you can buy, like a case with a tripod mount for capturing different angles. Otherwise, using Clips is as simple as turning it on and putting it where you want it.
Linking with Google Photos
You can adjust the frequency of captures in the Clips app. You can also train it on the people who matter to you by linking it with your Google Photos account. Clips is supposed to learn important faces based on who it’s exposed to most often, and using your Google Photos data is supposed to speed that along. You can also push the button on the front of the camera to take a direct portrait of someone you want Clips to prioritize.
Google takes all of that data and tries to figure out when your chosen people are in the frame and doing something photogenic, like smiling or dancing, and then automatically captures a clip.
Pets work a little differently, because Google’s algorithms can’t tell similar-looking animals apart. Clips basically just looks for any time there’s an animal in the frame and opens its shutter.
Google stresses that all of this happens locally on the Clips device itself and that nothing is processed in the cloud. In fact, Clips doesn’t have a way of connecting to the internet at all.
Once the camera has captured a bunch of clips, you use the app to browse through them on your phone. You can edit them down to shorter versions, pull still images out of them, or save whole clips to your phone’s storage for sharing and editing later. The app is supposed to learn from which clips you save and deem important, and then prioritize capturing similar clips in the future. You can also hit a toggle to view suggested clips for saving: basically, what the app thinks you’ll like out of everything it has captured.
I’ve been testing Google Clips with friends and family at events for a couple of weeks, and I can’t say I’m terribly impressed with the results. I’m used to being the one who takes the photos at our events, whether candid or posed, and it’s hard to trust that Clips will do what I normally do. For example, I found that it wasn’t taking shots all that often, so I tweaked the settings in the app to increase its frequency. But it still seems very conservative about what it will actually capture.
Google says Clips is supposed to let you stop worrying about taking videos and photos and just enjoy your time. But the company also admits that putting Clips in one spot and leaving it there isn’t ideal; you really have to move it around and try different places and angles to get the best results. At that point, I may as well just be using my phone.
On top of all that, the Clips hardware just isn’t very good. The images it captures are flat and grainy and often have a lot of motion blur, especially indoors, where I used it the most. The 15fps capture rate doesn’t make for smooth video, and there’s no sound either. At the end of the day, there just weren’t many clips that I thought it captured better than I could have with my phone, and in most cases the image quality was so poor that I wished I had used my phone to begin with.
There’s no doubt in my mind that I could improve its results by changing locations and paying attention to how and when the wide-angle lens works best, but I’m not convinced that effort would leave me happy with the end results. Not only is the cost hard to justify, but the app interface is also a pain to deal with. It’s easier to use the phone I already have and get better images from the start. Hopefully, Google can get a grip on this concept, because the idea itself is definitely super cool.