CameraBag 2 filters on some of my iPhone photos from our trip to Stockholm. If only there was an easy way to use all these fancy layered filters I’ve created on my laptop without having to transfer my iPhone photos to my laptop…
CameraBagged photo of Stockholm from the Sony RX100
I was recently preparing to spend a few weeks in Sweden with some of the Nevercenter crew, and found myself in B&H Photo in New York where 2 of the new Sony RX100s were in stock (they’d had a habit of going out of stock). I had to make an impulse decision - I was planning on taking my Canon 5D mkII to Sweden, but I’ve really been getting tired of lugging it around everywhere when I’m trying to enjoy myself doing day-to-day things.
I decided to give the smaller camera a go, figuring I could easily offload it if I kept ending up wishing I was using my 5D instead. I brought both with me to Sweden, and now two weeks into the trip I haven’t taken the 5D out a single time. Which is funny, because I’ve spent a lot of time in my life making fun of how bad Sony products are - I feel silly even now recommending a Sony camera to people.
Here are some of my favorite features of the RX100:
- Obviously its main feature is the big sensor and wide aperture in a small body (f1.8 when zoomed all the way out), letting you get both a nice shallow depth of field in your shots and great low-light performance, all in a compact camera.
- I find myself using the P mode for photos a lot - sure you’re less in control of the camera, but I feel like this thing is particularly good at getting the right exposure in shots. You can always go full manual when you want to.
- When shooting in P mode, turning the dial on the lens gives a great display readout that shows you all the possible aperture/shutter speed combinations that yield the same exposure.
- Speaking of getting the right exposure, there’s a great setting called D-Range Optimizer (presumably the D stands for “Dynamic”) that analyzes your photo right after you take it and appears to adjust the light response curve, using the extra precision in the RAW sensor data, to bring up shadow detail or tone down bright highlights before writing the JPEG to the memory card. It’s a similar idea to HDR imaging, but it looks very natural, much as your eye and brain actually perceive scenes with very light and dark portions. The camera shows you the before and after versions in quick succession after you take the shot, so you can see what it’s doing.
- Another neat feature I just discovered is a multi-exposure high-ISO mode, where it’ll take 5 or so photos quickly and use the combined data to get a less-noisy image than if you’d taken a single shot at the same ISO level.
- In general, the ISO controls are nice, letting you set minimum and maximum ISO for auto ISO modes.
- Using my 5D for video, I’d gotten used to the idea of not being able to use autofocus in videos, but the autofocus on the RX100 works great for the most part in video mode. I generally don’t think about focus at all while shooting videos.
- I like the ability to set this camera to show a red (or another color you choose) outline around objects that are in focus, which is helpful when doing manual focusing.
- There’s a good amount of customization possible for the various buttons and dials. I can almost get everything mapped to where I want it. Almost.
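The multi-exposure high-ISO mode above rests on a simple statistical fact: averaging N independent noisy frames reduces random noise by roughly a factor of the square root of N. Here’s a minimal NumPy sketch with simulated frames (this is an illustration of the principle, not Sony’s actual pipeline, which also has to align the frames):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a flat gray scene captured 5 times with random sensor noise.
scene = np.full((100, 100), 128.0)
frames = [scene + rng.normal(0, 10.0, scene.shape) for _ in range(5)]

single_noise = np.std(frames[0] - scene)   # roughly 10
stacked = np.mean(frames, axis=0)          # average the 5 exposures
stacked_noise = np.std(stacked - scene)    # roughly 10 / sqrt(5) ~= 4.5
```

With 5 frames the noise drops to about 45% of a single shot, which is why the camera can get away with a higher ISO in this mode.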
Here are some frustratingly almost-there features of the RX100:
- There’s a dedicated movie button, so when you’re shooting photos you can start recording a movie without turning the dial to movie mode. However, I always want to shoot my videos at a 1/60th-of-a-second shutter speed to get a cinematic look at 30fps, and when I start recording with this button from a still photo mode, the RX100 always drops into automatic exposure and adjusts the shutter speed to whatever it feels like. I wish that if I was in shutter priority mode shooting stills and pressed the dedicated movie button, it would shoot my movie in shutter priority mode too. You can shoot video in shutter priority mode, but only by setting the camera dial to movie mode and then choosing shutter priority from the menu that automatically pops up there.
- The autofocus is really nice and works most of the time, but I wish it would let me press the shutter halfway when I’m in movie mode to force it to focus on what’s in the center of the screen. It doesn’t, and sometimes I just have to go into manual focus, set the focus to the desired spot, then go back into autofocus mode.
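The 1/60th-at-30fps preference above is the conventional 180-degree shutter rule from film cameras: expose each frame for half its duration to get natural-looking motion blur. A tiny sketch (the function name is mine, not anything from the camera):

```python
from fractions import Fraction

def cinematic_shutter(fps):
    """180-degree shutter rule: shutter speed = half the frame duration."""
    return Fraction(1, 2 * fps)

cinematic_shutter(30)  # Fraction(1, 60) -> the 1/60s used in the post
cinematic_shutter(24)  # Fraction(1, 48) -> in practice you'd pick 1/50s
```

This is also why the lack of a 24fps mode (mentioned below) matters for filmmaking: the whole motion-blur look shifts with the frame rate.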
Here are some really bad design decisions made with the RX100:
- In general, the camera has an awkward and unnecessary menu setup where several positions on the main mode dial send you to a submenu before you can start shooting. For example, take the movie mode position on the main dial - when you turn the dial to this position, you’re presented with a menu screen where you choose manual, aperture priority, shutter priority, or automatic video mode. It would make much more sense to just set the dial to the corresponding mode for still shooting, then press the movie button and have it use the same settings for video. Also, getting back to these submenus requires either digging through the main menus or turning the dial away from its current setting and back again.
- Videos can be in either AVCHD or mp4 format, but the mp4 format can’t record at full HD resolution (it uses non-square pixels, so it’s almost full HD at 1440 by 1080), nor can it record at the higher 60fps that the AVCHD format can.
- Your video files are browsed separately from your still photos, so if you shoot a movie and then a still photo, you won’t see your movie when you press the review button until you navigate through the menus to get back to video browsing. Moreover, the AVCHD videos are stored in a different gallery than the mp4 videos.
- AVCHD videos are stored in a terrible container format where all your clips are essentially embedded in a single file. Seriously annoying.
- You can’t record videos at 24 frames per second, only 30 or 60 (unless you live in Europe, then it’s only 25 or 50). Videos at 24 fps look much more movie-like. Canon learned that this was super important to people and included it in a firmware update after stupidly not including it in the original firmware for the 5D mkII. I won’t be using this camera for any filmmaking other than fun little travel videos - I’ll still need my 5D.
And here’s Sony being Sony:
- When you’re choosing the format and size for movie mode, it warns you when your chosen format (which is most of them) can’t be burned directly to a Blu-ray disc. Sony, I promise you: nobody wants to burn their digital camera movies to a Blu-ray disc. Give it up.
In the end, the pros outweigh the cons. Hopefully (but don’t actually get your hopes up) Sony will update the firmware, or (slightly get your hopes up) someone will hack the firmware to fix the annoyances with this camera that, from my perspective, are all eminently solvable software issues.
One of the most fun things about having our first hit iPhone app was seeing it start to be used by celebrity-type folk. Not that we’re the type of people to be impressed with celebrity for celebrity’s sake, but that you feel like you’re suddenly starting to help shape a little corner of pop culture. One day you’re making a 3D modeling application that only gets seen and appreciated by true geeks, and the next you’re seeing your app in Apple commercials, and Ashton Kutcher starts posting photos of Demi Moore taken with something you made. Weird.
Of course, in the iPhone world, it doesn’t take long for someone else to see your idea and either copy it outright or copy central elements of it and add something else. Particularly funny was to see a few apps use our exact filter names - for example, we made up the filter name “Helga” as a fictional toy camera, and some other app makers thought it was a real camera name and put a filter named Helga in their own app with a similar style. We had one popular professional photographer who loved using CameraBag and was a great promoter of it, but then we noticed he went dark, so to speak, and didn’t interact with us or talk about CameraBag anymore for several months. Lo and behold, he’d decided to go out and create his own vintage filter app, which was released on the app store a few months later.
This brings up an interesting point that is hopefully becoming more apparent to app makers or aspiring app makers in general. That is: a great idea is not worth that much in the app world, in the end. App ideas are incredibly easy to copy and improve upon. We get people contacting us all the time saying “I have this great app idea and I was wondering if you’d be interested in making it and we’ll split the profits.” Unless you can offer really great money upfront, no programmer worth their salt is going to be enticed by such a proposition. Execution and elegance are the real valuable app components. CameraBag on iPhone is getting quite old and shows its age, yet there’s a simplicity to its interface and design that still appeals to people 4 years on. We’re hoping that CameraBag 2 for iPhone can retain that elegance while really advancing the photo app genre.
The most prominent vintage photo app that has sprung out of CameraBag’s success is, of course, Instagram. People ask us sometimes if we wish we’d worked to build a social element into CameraBag and been the ones selling to Facebook for a billion dollars. I would, of course, love to have a billion dollars, no doubt about that. But our company has this thing built into us where we just really want to make products that people love enough that they’re willing to pay enough for them to give us a good living. In other words, building a business that’s only projected to lose money for years and years and whose ultimate chance of making money depends on having a giant company buy it just doesn’t sit well with us.
Nevercenter’s been going for nearly 10 years now, and we’ve always been profitable and never used loans or investors. It’s fun sometimes to list all of the big talked-about companies who we’ve made more money than, because so many of them just lose money, sell out, and then disappear within their mother corporation. This isn’t a “we’ll never sell out” screed, it’s just to say that we don’t want to participate in the tech-company-that-loses-millions-then-sells-out-for-more-millions cycle that ends with a few wealthy folks and an eventual popped bubble.
I won’t go into too much detail about building this filter - you can download it and the original image and take a look for yourself. You’ll recognize this photo as the default thumb image in CameraBag 2. I took it when I was living in Tokyo a year or two ago, on the grounds of the Imperial Palace just after sunset. What a magical place to be!
In these variations I’ve basically just remixed the Lightleak tile, which I think produced some great variations - download the filter and the original image and try for yourself! Here are the files: http://www.nevercenter.com/blog_files/DarkLightleak.zip
We had some big debates about when to call the app finished and let it go live on the app store. We had uploaded it a few times and then realized it was crashing (due, in my view, to the terrible design of Objective-C, Apple’s required programming language), and had to cancel the submission and try to fix the out-of-memory errors. I’d argue it wasn’t worth letting the app go live with some detail or another not how we wanted it, and John would argue it was better to just let it go up sooner, as it was.
When we finally agreed on a good-enough build and got it submitted, approved, and on the app store, there were a handful of other photo apps (I think about 50 total) but they all did things like general image adjustments, nothing like vintage camera filters. We set the price at $5 originally - there wasn’t much precedent for app pricing so it seemed reasonable at the time. We just had no idea what the response would be, so we were extremely anxious to check out the numbers after the first day of sales. I believe we sold somewhere around 100-200 copies that first day, which we were really happy about since that extrapolated to making maybe $20,000 or $30,000 per month off of this app that took us really only about a month of full-time work to make.
A few days after the launch, I was looking at the app store in iTunes and found to my great surprise that CameraBag was chosen as a featured app on the front page of the app store. John and I had driven the day before from where we were living in Seattle down to our parents’ house in Utah, so the whole family began watching intently to see what would happen with the sales numbers after being featured. We could see CameraBag moving quickly up the top apps list, but the sales numbers for a given day don’t post until very late at night or early the next morning so you can imagine the suspense. We got the numbers at 3 or 4 in the morning, everyone was still up, and it came to somewhere around $8,000 for the day. Not bad for 3.5 employees!
CameraBag kept rising, and it maxed out at the number 2 spot of all paid apps. It’s possible it reached number 1 briefly, but I don’t think I ever personally saw it there. But it was definitely clear that this was going to be a popular category of app. We didn’t know how long we’d stay up in that rarefied position, so there was never a sense of striking it rich. We mostly enjoyed the flexibility the money would give us for the time being, but anxiously watched to see how long it would last.
As we’ve learned with several apps since the launch of the original CameraBag, after you stop being featured by Apple your numbers usually start dropping right away, quite steadily, for a long time. A few apps have seemed to find a way to stay at the top, but not many. If you start high enough though, that can still be very fruitful for a very long time.
In the final installation of this series, Part 3, I’ll talk about watching what happened when people everywhere started using CameraBag.
Hi, this is the first in what I hope to be a long series of posts about building up a filter in our desktop photo software, CameraBag 2. I’m going to show how I build up a filter to get the modern over-saturated look so popular with digital SLR users (think wedding and family photos). I’m actually using a still frame from a video I shot of a friend after she ran the Brooklyn half-marathon.
You can download the original image and the filter I created here:
To see this in action, I’d recommend loading the original image and the filter, disabling all of the tiles, and then enabling them one by one as I explain each step so you can see each effect in sequence.
Step 1: RGB Curve
I started with the RGB Curve tile from the Adjust tab to add a little contrast while also brightening the midtones. You can see I’ve pulled the lower end of the curve down just a bit to darken the shadows, while the rest of the curve is pulled above the midline to brighten everything else. Interestingly, CameraBag uses a different algorithm than Photoshop for controlling its curves, which in my opinion makes them much easier to control to get exactly the kind of curve you want.
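Under the hood, a tone curve like this is just a mapping from input to output brightness. CameraBag’s actual curve algorithm isn’t published, so here’s a rough sketch using plain linear interpolation and made-up control points that mimic the shape described above (shadows pulled slightly down, everything else lifted):

```python
import numpy as np

# Hypothetical control points in the 0-1 range: the point at 0.15 sits
# below the identity line (darker shadows), the point at 0.5 sits above
# it (brighter midtones).
xs = np.array([0.0, 0.15, 0.5, 1.0])
ys = np.array([0.0, 0.12, 0.58, 1.0])

def apply_curve(channel):
    """Map pixel values through the curve by linear interpolation."""
    return np.interp(channel, xs, ys)

pixels = np.array([0.1, 0.5, 0.9])
adjusted = apply_curve(pixels)  # shadows get darker, midtones brighter
```

A real implementation would use a smooth monotone spline rather than straight segments, but the idea of remapping brightness through control points is the same.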
Step 2: Color Corrector
I wanted to warm up the colors a bit, so for this I use the Color Corrector tile in the Adjust tab. I left the Color Method on Colorize (which I find works best for most color corrections) and then pulled the curve up to the left so that the darker midtones would be more affected than the lighter midtones. If, for example, I had pulled the left endpoint up, all the darkest/black parts of the image would be colorized towards the selected color. I just eyeball the color for what looks natural with the particular photo, and you can see I’ve selected a warm tan color. Try messing with the curve for different looks.
Step 3: Lolo
After this, I felt like the image was still missing some pop, so I added in a Lolo tile from the Styles tab, which simultaneously increases saturation, contrast, warmth, and does a few other tricks to make colors pop. However, Lolo at full strength was too much, so I toned it down a little with the Amount slider, and then used the Remix slider until I found just the right version of Lolo.
Step 4: Vignette
A slight vignette helps to give a little more focus on the center of an image, so I added a Vignette tile and left it at its default values other than turning down the Amount slider quite a bit to make it subtle. I also dragged the Vignette tile all the way to the left so that it gets applied first before the other effects. I find that placing the Vignette tile at the start of the effect chain consistently makes it look best/most natural, like a true vignette caused by a real lens.
Step 5: Tweaking
I always go back and tweak each tile after building up a filter, just to see if changing anything when they’re all combined together improves the image at all. It’s easy to undo any changes that aren’t for the better, and you can never really tell what some tweak is going to look like until you try it, so why not?
If you load in another photo with this same filter, you may find that you’ll want to adjust the RGB Curve tile a bit to get the right tone range for that particular photo. However, here’s a neat place where CameraBag shines: instead of messing with the existing RGB Curve tile, just add a second one to the left end of the tray (or possibly right after the Vignette tile) and do your additional adjustments there. In CameraBag, you can have as many separate RGB Curves as you want, without losing image quality, because CameraBag operates in 32-bits-per-channel, non-destructive floating point color space. Many other programs either force you to use only one curve, or they apply each separate curve destructively rather than in a high-accuracy non-destructive mode like CameraBag.
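The benefit of that 32-bit float, non-destructive pipeline is easy to demonstrate. In a destructive 8-bit workflow, every adjustment is quantized to 256 levels before the next one runs, so an aggressive adjustment followed by its inverse loses detail (banding); in float, the round trip is essentially lossless. A small sketch, with a deliberately extreme darken/brighten pair standing in for stacked curves (this illustrates the general principle, not CameraBag’s internals):

```python
import numpy as np

values = np.linspace(0.0, 1.0, 256)   # a smooth gradient of gray levels

def darken(v):
    return v * 0.25   # a harsh darkening "curve"

def brighten(v):
    return v * 4.0    # its inverse

# Destructive 8-bit pipeline: quantize to 256 levels between operations.
q = np.round(darken(values) * 255) / 255
destructive = np.clip(brighten(q), 0.0, 1.0)

# Non-destructive float pipeline: keep full precision throughout.
nondestructive = brighten(darken(values.astype(np.float32)))

err_8bit = np.max(np.abs(destructive - values))    # visible banding
err_float = np.max(np.abs(nondestructive - values))  # effectively zero
```

The 8-bit version can be off by a couple of gray levels after just one round trip; stack several curves and the damage compounds, which is exactly what the float pipeline avoids.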
Hope you enjoyed this, download the files above and try it for yourself!
Hello, welcome to the Nevercenter developer blog! I’m Tom, the founder of the company and one of the two developers, the other being my brother John. We’re joined by our great friend and web developer extraordinaire Stephanie. I’m going to kick off this blog by telling a bit about the birth of CameraBag, which launched the iPhone photo app craze. It’s coming up on 4 years now since CameraBag first came out - my how the time flies!
Many of the current conventions in modern photo apps originated in CameraBag. For example, CameraBag was the first photo app to focus on emulating vintage film cameras rather than just offering typical photo software effects like saturation, contrast, or sepia. Before the iPhone was released, I’d hoped in vain that some compact camera maker would offer this kind of functionality built into a regular digital camera. Both John and I had really enjoyed taking photos since digital cameras came on the scene, so I was looking at this idea as something that I really wanted as a camera user. When the iPhone SDK was announced (remember that the iPhone didn’t have 3rd party apps until a year after it was released), I instantly thought of my wish for a camera that focused on film styles. The fact that the iPhone camera was so bad on its own really helped give the idea juice, because we could make those photos interesting!
This was happening somewhere during the summer of 2008. Our company had been going for about 5 years at that point, and we only had one product: Silo, our 3D modeling program. Silo is what started our company - I initially created it as a final project during my senior year at Pomona College, and then spent the year after college and before I was to start architecture school working on it as a tool to use for my own architectural designs in grad school. I ended up dropping out of architecture school and focusing on software development, but then decided to go back for a masters degree in design computation at MIT a few years later. I had just finished up that degree when the summer of 2008 came along and saw the chance to make this photo app.
Here’s one of my original test photos taken with the prototype of CameraBag, which I took on a boring road in Seattle. This was an early version of the Helga filter.
CameraBag was initially going to be a little side project while we dove back into full-time work on Silo after I finished my masters degree. I threw the first version together in probably 3 or 4 weeks. I remember I was actually a bit slow getting it going, but then I heard through a friend that a guy he knew was making tens of thousands of dollars a month off of a really boring-sounding app, and that lit a fire under both me and John to get the app out ASAP. I was getting concerned because I saw that there were something like 30 photo apps in the app store, and I wanted to be sure to get CameraBag done before there were many more than that. There’s got to be what, like 50,000 photo apps now?
Another iPhone photo app convention that launched with CameraBag was the friendly naming of filters. Instead of calling our black and white filter “B&W”, we called it “Ansel”. Sadly, this ended up bothering some people with lawyers, so we had to change that one to boring old “Mono.” But we used that kind of filter naming where we could, and we still do. This wasn’t supposed to be scientifically accurate film emulation, this was supposed to be fun. The first release of CameraBag had 5 filters, as I recall: Ansel, Helga, Instant, 1974, and 1962. I had never really done much in the way of writing image processing code, but the concepts are simple enough that, as I said, it only took a few weeks to throw it together. It would have gone faster if Apple’s required programming language, Objective-C, weren’t so terrible. In another post at some point, I’m going to write about the irony of Apple’s products being so user-friendly and well-designed on the outside, while the innards of their programming language are like a rotten core. Microsoft, on the other hand, has generally bad user interfaces but a really beautiful programming language in C#. Fortunately, Apple lets you write stuff like image processing code in C++. One of the reasons nobody is writing apps for the Windows Phone is that you have to use only C#, which is not compatible with other platforms. I digress.
We had some debate about the naming of the app. In particular, John thought that “CameraBag” didn’t sound like an app you would use to take photos. He still thinks it would have been better named something else. I guess we’ll never know. However, soon after it launched on the app store we stopped really worrying about the name. More about the launch in the next post :].