Smartphone manufacturers have generally settled on having 30 or 60 fps available for video recordings, but some are pushing the boundaries, offering 120 or even 240 fps. Are they viable or just a marketing trick to make us buy more expensive and advanced, but not necessarily more useful smartphones? In this article, I’m going to answer a few questions that you may have regarding video recording. Is 30 fps enough? Is 60 fps video recording even worth using? What’s the best frame rate for video recording on smartphones? I analyzed different recording scenarios and I’m ready to settle this “30 vs 60 fps video recording” debate:
Any camera, digital or analog, records moving images as a series of still photos, and then plays them back in very quick succession, which our eyes perceive as motion. To put it simply, the frame rate indicates how many still frames are recorded (and then presented) in each second for a given video.
When you look at the settings for the Camera app on your smartphone, you will most likely come across the following frame rates: 24 fps, 30 fps and 60 fps. Some high-end smartphones, like the Sony Xperia PRO-I, can also record at 120 or even 240 fps. Why those numbers and not, for example, 34 fps? It has to do with how electronics and electrical current work and also, perhaps surprisingly, with habit.
Back when silent movies first appeared, it made sense from a financial and a technical point of view to shoot them at the lowest possible frame rate that would still preserve the illusion of motion. The frame rate often varied from scene to scene (within the same movie), and theaters would also show pictures at different speeds. However, once sound was added to the images, the industry decided to enforce a standard, because the human ear is more sensitive to changes in frequency than the eyes, and the sound was linked to the frame rate. Without getting into technical details, 24 fps was chosen because it made the construction of recording and playback equipment simpler.
As for 30 and 60 fps, they were born out of an even more technical reason: the way that electrical current is delivered in the USA and other parts of the world. The current alternates its polarity (that’s why it’s called AC, or alternating current) at a frequency of 60 Hz, or 60 times per second. Thus, television sets could easily display 30 images per second, without the need to complicate the electronics inside (60 fps was still beyond reach at that time). There is more to it, but if you want the full history, check the Wikipedia page on frame rates.
Naturally, you would think that shooting at the highest frame rate possible is the way to go, and to an extent, you would be right. Researchers showed in a paper published in 2014 that the human brain can process images shown for as little as 13 milliseconds. If you stack the images up and create a video, 13 ms per image corresponds to a frame rate of roughly 77 frames per second. Subsequent testing has shown that our brains can adapt to 90 or more images per second. Just look at professional gamers: they prefer monitors running at 120 Hz or even faster. However, there are quite a few things to take into account when recording a video, especially on smartphones. Let’s take a look at what changes when you modify the frame rate:
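That per-image duration converts to a frame rate with a simple reciprocal. A minimal Python sketch (the function name is my own, for illustration):

```python
def duration_to_fps(frame_duration_ms: float) -> float:
    """Convert a per-frame display duration (in milliseconds) to frames per second."""
    return 1000.0 / frame_duration_ms

# The 13 ms per image from the 2014 study:
print(round(duration_to_fps(13)))  # → 77
```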
First of all, let’s consider exposure. We already established that a video is just a collection of still images. The shutter takes a picture, then closes, then opens again, and so on. When recording at 30 fps, the shutter can stay open for a maximum of 1/30 of a second per frame, taking in more light. Of course, depending on lighting conditions, it might not need to, but let’s focus on the maximum light that the sensor can gather. If you increase the number of frames per second to 60, the time the sensor has at its disposal to gather light for each frame drops to half, 1/60 of a second. This is perfectly fine too, if you record in good lighting conditions, but if you record at dusk or in less than optimal conditions, the camera has to increase the sensitivity of the sensor (the ISO setting) to compensate, which in turn decreases the quality of the captured images and introduces what’s known as image noise - artifacts that show up in the darker areas of the picture.
If you go beyond 60 fps (for example, for slow-motion videos recorded at 240 fps, or even 960 fps for Samsung’s Super Slow-mo mode), the effect becomes even more visible. To record a good-quality high frame rate video, you need your subject to be extremely well lit.
In a nutshell, the lower the frame rate, the more time the camera has to gather light, and the greater flexibility you have for recording in low light.
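That trade-off is easy to quantify: the upper bound on exposure time is simply the frame interval. A short sketch (the helper name is mine):

```python
def max_exposure_s(fps: int) -> float:
    """Longest possible shutter time per frame: the shutter cannot stay
    open longer than one frame interval (1/fps seconds)."""
    return 1.0 / fps

for fps in (24, 30, 60, 240):
    print(f"{fps:>3} fps -> max exposure 1/{fps} s ({max_exposure_s(fps) * 1000:.1f} ms)")
```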
The second thing to take into account relates closely to what I covered in the previous section - image exposure. Let’s say you are recording a fast-paced race. At 24 or 30 fps, since the shutter can be left open for longer, the resulting images can have more motion blur (the subjects move more while the sensor is capturing each frame). When recording at 60 fps, you force shorter exposure times, and thus you can potentially halve the blur for fast-moving subjects. On digital cameras, the blur is also affected by the resolution of the recording. Make sure you check out our article on what resolution to choose when recording with your smartphone for more details.
Here is a video recorded at 30 fps with the Xiaomi 11T:
And here is a still image with a fast moving object from that video:
Now, watch the 60 fps recording:
If you look closely, you can see that motion blur is reduced:
The conclusion? The higher the frame rate, the lower the motion blur. Is that a good or a bad thing? The answer depends on what you are trying to achieve with the recording. The human eye naturally registers motion blur in real life, so motion blur in recordings often mimics reality better.
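One way to see why shorter exposures mean less blur: the blur streak is roughly the distance the subject travels across the frame while the shutter is open. A minimal sketch with illustrative numbers of my own (not measurements from the videos above):

```python
def blur_px(speed_px_per_s: float, shutter_s: float) -> float:
    """Approximate blur streak length: the distance the subject moves
    across the sensor while the shutter is open."""
    return speed_px_per_s * shutter_s

# A subject crossing the frame at 3000 pixels per second:
print(blur_px(3000, 1 / 30))  # ~100 px streak with a full 1/30 s exposure
print(blur_px(3000, 1 / 60))  # ~50 px - halving the exposure halves the blur
```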
We’re used to watching TV and going to classic movie theaters. These mediums present the images at 30 and 24 frames per second, respectively. So our brains have learned to associate a “good” experience with these frame rates. Even if 24 fps is kind of low for fast-paced action movies, you expect this and your brain adjusts. Try this: right after exiting a movie theater, watch a short video shot at 60 fps. It looks awkward, doesn’t it?
It’s already in the history books that, when director Peter Jackson decided to record and release the movie “The Hobbit” at 48 fps, many critics and viewers found the high frame rate so distracting that it ruined the movie for them.
The smoothness of the video makes it look unnatural. So, when making a recording, first consider its purpose. If you want a casual recording, 60 fps might bring you closer to the event, but if you plan on doing something more artistic, perhaps 30 fps is a better idea.
So we can conclude that the higher the frame rate (or the farther it is from 24 fps), the more unnatural the recording looks. It’s completely subjective and it will change for future generations, as more and more people are exposed to high frame rate recordings, but for now, our “monkey brains” say no.
Next, let’s move on to more technical aspects. A video shot at 60 fps requires more processing power, as more images are recorded in the same interval. So I devised a simple experiment: using the Samsung Galaxy S21 FE, I measured the battery drain while recording a video with the same resolution (4K) but at two different frame rates, 30 and 60 fps.
The results speak for themselves: for a 10-minute video shot at 30 fps, the battery dropped by 4%, while for the 60 fps video, the battery took a hit of 5%. This equates to a 25% increase in battery consumption. Moreover, the smartphone was noticeably hotter after recording the 60 fps video (I let the smartphone cool down between recordings).
Recording at higher frame rates consumes more energy and generates more heat. Consider this when low on battery or recording in very hot environments.
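The 25% figure from my test is plain relative-change arithmetic; a quick sketch (the function name is mine):

```python
def relative_increase_pct(base: float, new: float) -> float:
    """Relative change from base to new, expressed as a percentage of base."""
    return (new - base) / base * 100

# Battery drain for a 10-minute 4K clip on the Galaxy S21 FE:
print(relative_increase_pct(4, 5))  # → 25.0 (a 25% increase from 30 to 60 fps)
```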
Of course, since roughly twice as much information is recorded, the file size increases when switching from 30 to 60 fps (it should double, but video compression algorithms reduce the difference). This shouldn’t be an issue if you rarely record video, but if you routinely record clips, let’s say for a YouTube channel, you might run out of space sooner than you think. Let’s make a simple calculation by recording two videos in the same location: on the Samsung Galaxy S21 FE, a 5-minute 4K/30fps video takes up 1.7 GB, while a 4K/60fps video of similar length takes up 2.53 GB. Now, on a typical 128 GB of storage (pretty much the standard for modern smartphones), after deducting the system and app files, you are left with roughly 100 GB. That’s enough space for fifty-eight 5-minute videos in the first case, and thirty-nine in the second. Now consider the fact that many modern smartphones, contrary to the “modern” naming, don’t have expandable storage, and you can see the issue more clearly.
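Here is the same storage calculation in code, using the file sizes I measured (the helper name is my own):

```python
def clips_that_fit(free_gb: float, clip_gb: float) -> int:
    """How many whole clips of a given size fit into the free storage."""
    return int(free_gb // clip_gb)

FREE_GB = 100  # roughly what's left on a 128 GB phone after system and apps

print(clips_that_fit(FREE_GB, 1.70))  # 5-minute 4K/30fps clips → 58
print(clips_that_fit(FREE_GB, 2.53))  # 5-minute 4K/60fps clips → 39
```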
Furthermore, file size directly affects upload times. If you want to share a video on social media or send it to a friend, unless you have a good bandwidth and an unlimited data plan (or Wi-Fi), sending will take a lot of time (and will also consume much more battery and mobile data).
Thus, recording at higher frame rates fills up your storage space faster, which might be an issue if you record many videos and don’t move them periodically to a different device.
As technology advances, this is quickly becoming a non-issue, but for the time being, it’s worth mentioning that if you are recording at higher frame rates, you lose some of the technical innovations available for standard frame rate recordings. Just to give you a quick example, the Samsung Galaxy S22 Ultra has a feature called Super Steady Video (SSV) which is reserved for 1080p videos and 30 frames per second. Going above 30 fps removes this option. Depending on your smartphone model, you might get even more restrictions, such as the lack of standard video stabilization or even lower resolution (many last-gen smartphones could only record at 60 fps in 1080p resolution). Just take a look at the Settings menu from an older Huawei P30:
At 60 fps or higher, at least for the current generation of smartphones, you can lose some of the flexibility and features integrated into the camera hardware and software.
Taking all things into account, is it worth recording at 60 fps on your smartphone or should you just stick to 30 fps? Should you go even higher, if your smartphone allows it? In my opinion, in 2022, the answer depends a lot on what you plan to do with the recording. I personally lean towards 60 fps, but it depends on the subject, the recording conditions, and what the footage is for. Here’s the breakdown:
- Recording at 30 fps is just fine if you want to record the live performance of your favorite band or share a funny situation with your friends on social media. You get the full processing features of your Camera app and smartphone hardware, better performance in low light, the battery hit isn’t as large as with 60 fps recordings, you can store more videos on your device, and the recordings don’t take ages to upload. Recording at 30 fps is thus sufficient for general, casual recordings, both in well-lit environments and in dim light.
- Recording at 60 fps gives the viewer a more immersive experience, but it’s also less “cinematic”. You should consider the subject: if it’s fast-moving, like an IndyCar or your 5-year-old kid, 60 fps is smoother, with less motion blur. The quality decreases rapidly if the lighting conditions are less than ideal. Also, when recording at 60 fps, the file size is greater and the smartphone consumes more battery than at lower frame rates. On the other hand, you can edit the video further and slow it down while retaining an acceptable frame rate. Thus, recording at 60 fps is perfect for fast-paced videos in good lighting conditions. Depending on your smartphone model and technology, there might be some drawbacks to using 60 fps, such as lower resolution or the lack of stabilization.
- Recording at 120 fps or more, available on high-end camera-phones like the Sony Xperia PRO-I, is good only in very specific use cases. For example, if you want to record and then edit your videos, recording at 120 fps allows you to slow down the footage as much as four times while still being able to output a 30 fps video. Note that the higher you go, the better the lighting conditions must be, as the shutter speed increases. The same disadvantages of 60 fps recordings remain, but are accentuated: higher battery consumption, larger files, restrictions with regard to resolution, etc. Not only that, but 120 fps videos can’t be shared as such on platforms like YouTube, since the platform limits the frame rate to 60 fps. (Fun fact: you can bypass that by slowing the video down, uploading it at 60 fps, and then playing it back at 2x speed.) I cannot recommend going over 60 fps for normal recordings, only for videos meant for post-production.
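The slow-motion headroom mentioned in the last point is simply the ratio between the capture and playback frame rates. A minimal sketch (the function name is mine):

```python
def max_slowdown(capture_fps: int, playback_fps: int) -> float:
    """Maximum slow-down factor that still shows every captured frame
    exactly once at the playback rate."""
    return capture_fps / playback_fps

print(max_slowdown(120, 30))  # → 4.0: 120 fps footage can play at quarter speed
print(max_slowdown(240, 30))  # → 8.0
```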
You can go lower than 30 fps on some smartphones (usually 24 fps, as a limitation of 8K video recording), but that, too, isn’t something I would recommend, as videos shot on smartphones tend to contain rapid camera movements, and when you combine those with the low frame rate, you get a lot of judder and visible stutter.
I’d love to know your opinion on the different frame rates for recording videos on your smartphone. Do you think manufacturers should focus on delivering even higher frame rate capabilities for our smartphones? Or should they rather focus on equalizing the features available on existing ones? Let me know in the comments below.