How the smartphone camera changed everything

From the way we communicate to how many photos we take a day, the camera in your phone has changed the way we live. By James Lu, Alvin Soon, and Marcus Wong


The camera phone has swallowed the world

Who made the first camera phone? It depends who you ask. In 2000, Samsung and Kyocera both released their first camera phones in South Korea, while Sharp released one in Japan. The Samsung SCH-V200, for example, went on sale in June that year. It could take 20 photos at a meager 0.35 megapixels (MP), and you had to hook it up to a PC to extract them.

Camera phones have advanced dramatically since. The Samsung Galaxy S9+, for example, shoots 12MP photos and 4K video, and with 256GB of internal storage it can store more than 50,000 images. Beyond the advances in technology, the camera phone has triggered an explosion in photography. For the past seven years, the most popular camera on Flickr has been a smartphone, and InfoTrends estimates that in the past five years, the number of digital photos taken each year has surged from 660 billion to 1,200 billion.

The market research company believes that smartphone cameras are behind the growth; it estimates 85 percent of all images captured in 2017 were shot on smartphones. It’s little wonder, as smartphone cameras are everywhere: an estimated 1.54 billion smartphones were sold worldwide in 2017.

There’s a saying that “the best camera is the one that’s with you.” If so, it’s no wonder smartphones have led to a rise in the number of photos and videos. We used to carry our cameras only on special occasions. Today, we photograph everything with our phones, from holidays, to lunch, to silly selfies.

Besides changing how we photograph, smartphones have also changed how we communicate. Instead of calling someone back or typing a reply, we send videos. Instead of jotting down a reminder, we simply snap a photo. You can get a news update from a video posted on social media faster than you can read about it in the press. It’s amazing to think that these are now commonplace practices that weren’t around a decade ago.

But the smartphone camera’s success has come at the expense of the digital camera. According to CIPA, the total shipment of digital cameras peaked in 2010, with 121 million units. In 2017, CIPA estimates that only 25 million digital cameras shipped — a shipment number not seen since 2002.

Camera manufacturers have an optimistic rebuttal to this encroachment: smartphone photography will lead more users to digital cameras. The more users become enamored with photography, the reasoning goes, the more they’ll crave the better image quality that ‘serious’ cameras provide.

It’s a limited dataset, but Flickr does report 8% more DSLR photos being uploaded in 2017 than 2016. But at the same time, smartphone uploads grew by 2% and made up half of all photos uploaded to Flickr. And the latest data from CIPA also shows that DSLR shipment numbers for early 2018 have been the worst since 2012.

It looks like the future is rosy for smartphone cameras. Today, the estimated number of smartphone users stands at 2.53 billion, and is estimated to grow to 2.87 billion by 2020. Smartphone cameras are advancing, even introducing features unavailable on digital cameras.

People are calling this pioneering new field of digital imaging ‘computational photography.’

Smartphones, for example, were the first cameras to shoot handheld HDR (High Dynamic Range) photos. Previously, a camera had to be mounted on a tripod to capture the exposures, which then had to be composited using HDR software on a PC. On a smartphone, it’s all done automatically.

The latest smartphones take computational photography even further. The Samsung Galaxy S9 series, for example, captures up to 12 images for a single photograph. It uses these 12 images to compose a final image that’s richer and more vivid than possible with a single capture.
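The intuition behind this multi-frame approach can be shown with a toy sketch. This is purely illustrative and assumes nothing about Samsung’s actual pipeline: averaging twelve aligned captures of the same scene cuts random sensor noise by roughly the square root of twelve, which is one reason a merged result looks cleaner than any single capture.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate 12 noisy captures of the same flat gray scene.
true_scene = np.full((64, 64), 128.0)
frames = [true_scene + rng.normal(0, 10, true_scene.shape) for _ in range(12)]

# Merge step: average the aligned frames pixel by pixel.
merged = np.mean(frames, axis=0)

single_noise = np.std(frames[0] - true_scene)   # ~10
merged_noise = np.std(merged - true_scene)      # ~10 / sqrt(12), i.e. ~2.9
print(f"single frame: {single_noise:.1f}, 12-frame merge: {merged_noise:.1f}")
```

A real phone also has to align frames, reject motion, and weight exposures, but the noise-averaging principle above is the core of why more captures yield a richer final image.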

The S9+ also has a dual camera system and the Live Focus feature, which produces an image with a simulated blurry background. Also known as bokeh, this effect was previously only achievable on cameras with high-end lenses.

Smartphone cameras are still seen as inferior to digital cameras, but computational photography helps close the quality gap. And unlike a digital camera, the smartphone is always with us, it’s more compact, and it shares images far more easily. The camera phone has come a long way since the days of 0.35MP photos.

f/2.4 telephoto, 1/1,250 sec, ISO 50 taken with Samsung Galaxy S9.


Digital Photography in Numbers

Sources: Flickr, InfoTrends, CIPA


f/1.5 with wide-angle, 1/10 sec, ISO 400 taken with the Samsung Galaxy S9.

Controlling the light

When we talk about camera lenses, one of the biggest considerations is the aperture. This is the mechanism that creates a hole inside the lens, through which light passes to fall onto the camera’s sensor.

This hole is formed by a set of interlocking blades that open and close to adjust its size, much like the iris in our eye expands or shrinks. And just as our irises let our eyes adapt to the amount of light around us, the aperture of a lens controls exposure.

Because a wider opening admits a broader cone of light, aperture also changes the depth of field in an image. The larger the aperture used, the shallower the depth of field you’ll get, meaning everything other than your main subject falls out of focus to varying degrees.

Because the sensors used for camera phones are so small, manufacturers pair them with large aperture lenses for better performance in low light. The Samsung Galaxy S9 series uses a lens that has dual apertures. There’s a narrower f/2.4 setting for shooting in brighter places, and a wide f/1.5 setting for low light.

The camera phone and the rise of visual communication

As mentioned, smartphones have changed the way we communicate too, and computational photography plays a big role in this evolution. Go ahead, think about the last time you had a purely text-based conversation on your phone that didn’t include any pictures, filters or emojis. You’re probably also wondering why your mobile provider insists on bundling hundreds of “free” SMSes with your mobile plan.

Smartphones essentially made standard texting obsolete with the introduction of rich media apps like WhatsApp, Facebook Messenger and Line. Just like the instant messengers on a PC, you could craft and reply to messages in real-time, peppered with a mix of cute emojis and stickers. The phone camera became indispensable in fueling the boom of social communication and selfie culture. At this point, however, the camera on our phones was still the “dumb” variety, limited to shooting standard stills and video.

The turning point came in 2015, when two major events shaped smartphone camera development over the next two years.

Firstly, Oxford Dictionaries named the ‘Face with Tears of Joy’ emoji as its Word of the Year 2015. That’s right, the Word of the Year for a major dictionary wasn’t even a word, but a pictogram. According to the study conducted by Oxford Dictionaries and SwiftKey (the makers of the keyboard app), the use of emojis more than tripled in 2015 over the previous year. The report also noted that emojis were no longer confined to teens and younger users, but had reached widespread mainstream acceptance. A major Hollywood movie was even made based entirely on emojis. The film was expectedly bad, but that’s how relevant emojis have become in society today.

The second catalyst came from Snapchat. In 2015, Snapchat introduced Lenses, a face detection feature that used the phone camera to map your face and apply, in real-time, fun effects and masks for its video microblogging platform. This feature was quickly adopted by rival social media apps like Facebook and Instagram. As phone cameras and processing power improved, the quality of these augmented reality (AR) effects has become increasingly accurate and sophisticated. Today, you can easily make an entire video call on your phone looking like a cute bear or wearing a monkey-controlled spaceship hat.

By April 2017, at the F8 developer conference, Facebook CEO Mark Zuckerberg readily acknowledged that the future of social media lies in AR. In September 2017, Apple showcased Animoji with the iPhone X, and for the first time, AR face-mapping technology and emoji-fueled communication converged.

In February 2018, Samsung announced its Galaxy S9 series, taking the emoji game to a whole new level. Termed AR Emoji, Samsung Galaxy S9 devices combine their camera and AI processing to map more than 100 different facial features onto your very own custom 3D avatar. This avatar can then be used as a live-action emoji or animated sticker in most popular social media, chat and messaging apps.

These developments might sound frivolous, but camera phone innovations and computational photography are powering the next generation of visual communication.

Mark Zuckerberg was right. Our whole future lies in the palm of our hands, through the reality bending lens of the smartphone camera.

Facebook AR filter
Apple Animoji
Samsung AR Emoji

"Samsung Galaxy S9 devices uses its camera and some AI processing to map more than 100 different facial features onto your very own custom 3D avatar."


The need for speed

After megapixels, the next pursuit for camera makers is speed. This has been especially so with more people shooting video. Faster processing is needed to enable features such as slow-motion video.

When shooting slow-motion video, you’re capturing frames at higher rates and then playing them back at normal speed. It’s a lot of information captured in a short time, especially when shooting at high resolutions.
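The arithmetic is simple: the slowdown factor is the capture frame rate divided by the playback frame rate. A short illustrative sketch, using 960 fps capture played back at a standard 30 fps (figures chosen for the example):

```python
def slow_motion(capture_fps: float, playback_fps: float, clip_seconds: float):
    """Return the slowdown factor and the stretched playback duration."""
    factor = capture_fps / playback_fps
    return factor, clip_seconds * factor

# 0.2 seconds of real-world action, captured at 960 fps:
factor, playback = slow_motion(960, 30, 0.2)
print(f"{factor:.0f}x slow motion; 0.2 s of action plays back over {playback:.1f} s")
# 32x slow motion; 0.2 s of action plays back over 6.4 s
```

That 0.2-second burst is 192 frames, all of which must be read off the sensor and buffered almost instantly, which is why the sensor changes described next matter so much.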

To handle the load, the first step has been to introduce faster processors. Another change has been to the design of camera sensors. For example, moving the wiring layer below the photodiode substrate allows a sensor to capture light more efficiently.

On a typical sensor, a high-speed signal processing circuit is placed together with the photodiode substrate. But on stacked sensor designs, the processing circuit is moved to a separate layer. Integral memory is added to this layer to buffer operations, so more information can be collected than with the image processor’s on-board memory alone.

These changes to sensor design mean the camera can keep taking images instead of having to wait for the processor. This enables the hyper-fast capture speeds needed for super slow-motion video. Some phones like the Samsung Galaxy S9 series go even further. Samsung’s ISOCELL imaging sensor isolates each pixel on the photodiode substrate to decrease crosstalk and increase capacity. This provides increased speed, wider dynamic range, and better image quality.

Advanced smartphone cameras like the Samsung Galaxy S9 can capture up to 960 frames per second.



Before smartphones, mobile gaming required a handheld console, and was dominated by the likes of Nintendo and Sony. Games were expensive and required a physical cartridge to play. Multiplayer gaming was practically unheard of. Smartphones completely changed that. You no longer need a dedicated handheld console; you already own everything you need. Games can be downloaded at a fraction of the cost, often for 99 cents or even for free. And there are about 4.7 billion people out there you can play with.



Mobile processors are now so powerful that anyone with a smartphone can handle almost all of their PC tasks. Samsung’s own DeX accessory can run a Galaxy device with a PC-like experience. Add better battery life, vastly improved networking speeds, and larger screens on mobile devices and tablets, and it’s easy to see why the global PC industry has been on a downward trend for the past few years.



Before smartphones, if you wanted to watch a TV show you sat in front of the TV, and if you wanted to watch a movie, you went to the cinema. But today you’re more likely to find a high-resolution OLED display with excellent colour accuracy in your pocket than in your living room. The best display you own isn’t your TV; it’s your smartphone.

That’s why every major movie and TV studio has expanded its distribution to include download and streaming services for mobile devices. According to a recent report from Netflix, more than 50 percent of its content is viewed on mobile devices.
