The Test Tube – Speed Dating meets User Experience Testing

One of the things I love about working out of WeWork Labs is coming across people in the technology world I wouldn’t normally meet, and finding out about what they’re up to. A couple of the members here run a meetup called The Test Tube (twitter at @testtubenyc). Its elevator pitch is ‘Speed Dating meets User Experience Testing’ which at first sounded like something I wouldn’t be that interested in. Other WeWorkers (yeah, I just did that, sorry) were highly complimentary of it though so I decided to try it out. And I’m very glad I did.

To set a little context here – I’m working on a brand new product with a friend of mine. We’re about 3 weeks in and have a very early, very rough, prototype of a small amount of what we want in the MVP. I thought it was too nascent to be able to get user feedback on but I was convinced by Pierre Wooldridge, one of the Test Tube organizers, to try out his meetup anyway.

This edition took place at Gilt’s office in midtown, with about 50 people attending. After brief introductions and a short talk from one of Gilt’s UX people we got down to business. Here’s how it works:

  • Everyone there is organized into a first pair, who I’ll refer to as Ms Green and Mr Red.
  • Ms Green has 7 minutes to get feedback on her product from Mr Red.
  • Ms Green starts by giving the briefest context possible, and by describing the scenario she’d like Mr Red to try to work through.
  • Mr Red then uses the product, vocalizing his thought process as he goes.
  • When there’s about a minute left they’ll try to summarize the experience.
  • Ms Green and Mr Red then swap roles, giving Mr Red 7 minutes to get feedback on his product from Ms Green.
  • After both people have gone through the process all pairs are rotated (a strict clock is kept in the room) and the process is repeated 4 more times, giving each person 5 different opportunities to get feedback.

I’ve never done user experience (UX) testing before with people I didn’t already know, and I found the process absolutely fascinating. Even with the extremely raw product we currently have, there was enough there for our opposite numbers to give what in their minds were just first impressions, but in ours was insightful feedback. As an example, in 4 of the 5 rotations one of the most basic assumptions I’d made about the product, which affects the very first screen of the application, turned out not to fit people’s expectations.

One of the truly brilliant aspects of The Test Tube is the time constraint. Not knowing the people you’re sitting with could lead to social awkwardness and hesitancy. But with only 7 minutes you’ve got no time for that, so you’re forced to plough straight in. Furthermore, since there are only 15 minutes per pair, you can get 5 completely different sets of feedback in less than an hour and a half – brilliant!

I’d like to congratulate Pierre and Tom on a fantastic idea, well executed. I’d wholeheartedly recommend The Test Tube to other NYC software product developers, whether in startups or established businesses.

Building a Clojure project on Travis CI that’s in a sub-directory of the repository

This wasn’t entirely obvious to me, so here goes.

First, some very quick background. Travis CI is a wonderful hosted continuous-integration service that is free and very easy to use for open-source projects on GitHub.

I have a Clojure project on GitHub that I want to build, but it’s in a sub-directory of its parent repository. It took me a few attempts to get Travis to handle this correctly, but in the end it was simply a matter of doing the following in the .travis.yml file:
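The working approach, sketched minimally here (the sub-directory name ‘myproject’ is a placeholder of mine, not from the original post), is to change directory in before_install:

```yaml
language: clojure
# Travis runs every phase of the build in the same shell session, and
# 'lein deps' runs as the install step immediately after before_install,
# so this directory change is in effect by the time lein looks for project.clj
before_install: cd myproject
```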

What doesn’t work (and what I tried before realizing what’s going on) is using before_script, or putting the ‘cd’ call within a script line itself. This doesn’t work because Travis runs ‘lein deps’ after before_install but before before_script (and therefore before script), and thus fails if you haven’t already changed to the directory containing your project.clj by that point.

My full .travis.yml at time of writing is as follows:
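The original file isn’t reproduced here, but based on the description above a plausible minimal version would look something like this (again, ‘myproject’ is a placeholder directory name, and the lein version line is an assumption):

```yaml
language: clojure
lein: lein2              # assumption: project uses Leiningen 2
before_install: cd myproject
# With language: clojure, Travis runs 'lein deps' as the install step
# and 'lein test' as the default script step, both from myproject/
```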

Clojurenote – A Clojure library to access the Evernote API

Recently I’ve been having a lot of fun using the Evernote API from Clojure, especially as part of developing Orinoco. I’ve now open-sourced my lowest-level Evernote code as Clojurenote.

Evernote already provides a thorough Java SDK and Clojurenote doesn’t aim to completely replace it. Clojurenote does, however, implement the following:

  • OAuth authentication (using the fantastic Scribe Java library)
  • Basic read/write capabilities, using an OAuth access token, or developer token
  • ENML to HTML translation (ENML is the XHTML derivative that is used for the content of Evernote notes)

You’ll still need to be happy mucking in with the Java SDK but Clojurenote will make a few things cleaner and easier for you in your Evernote-accessing Clojure code.
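To give a feel for how the pieces listed above might fit together, here is a sketch. Be warned that the namespace and function names below are my assumptions based on the capabilities described, not Clojurenote’s confirmed API – check the project’s README for the real calls:

```clojure
;; Sketch only – namespaces and function names below are assumptions,
;; not necessarily Clojurenote's actual API.
(ns my-app.evernote
  (:require [clojurenote.read :as read]
            [clojurenote.enml :as enml]))

;; A user map as produced by the OAuth flow (values elided)
(def user {:notestore-url "..." :access-token "..."})

;; Render a note's ENML content as HTML for display in a web app,
;; using the library's ENML-to-HTML translation
(defn note->html [note]
  (enml/enml->html (:content note)))
```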

Why Evernote?

I’ve been using Evernote a lot recently. In fact, I’ve gone a little bit overboard. Being an every-day human user wasn’t enough so I started writing applications that use Evernote. And I went to their conference. And I’m even in the analysis stages of starting a business based on Evernote. But why? Here I give a few reasons that hopefully give some method to my Evernote madness.

Firstly a quick introduction for those who don’t know what Evernote is. It’s a tool for easily capturing pretty much any thought or content that can be put into an electronic document, tracking it however you like, from any device, on-line or off-line, with the ability to share with other people and applications.

Urrk, that’s not as quick as I’d like. I wanted to make the description smaller, but it’s precisely the combination of all these things that makes Evernote so compelling to me. I’ll try to break them down a little.

I use Evernote for storing (at least) to-do items, reminders, emails I want to read later, web pages I want to save, travel plans, project ideas, a journal, restaurants I might want to go to with my wife, workout history, and a whole lot more. Evernote themselves like to use the phrase ‘outsource your memory’ – that describes my usage well.

One of the things I particularly like about Evernote is that it lets you organize things how you want, not how it wants. Evernote gives you a blank slate, a few cataloguing criteria, and then gets out of the way. The genius is that there’s enough to be powerful, but also little enough for you to organize, and change your organization process, as you see fit. This feature of Evernote is actually what causes a lot of people to struggle with it at first (and it took me a while to get used to), but frequent Evernote users often find that once they build momentum, they discover ever more opportunities to increase the efficiency of their lives through further use of Evernote.

I use Evernote wherever I am. I primarily use the Mac Desktop app, but I also use the iPhone / iPad app, the web clipper, the email bridge, etc. The off-line capability of the iPhone app is great – I can check my important to-dos from the subway and then read some articles I’ve saved.

Sharing is useful too. As a premium user I set up notebooks with my wife that we can both edit. We typically use this for restaurants and bars that we want to try out, but shopping lists are another common use for other people. And that’s before you even start considering the business usages of shared notebooks.

Also of huge value is Evernote’s API. It means that many apps beyond Evernote’s own can integrate with my Evernote data. I’m a big fan of using Drafts on the iPhone to create content on the go. My own app, Orinoco, lets me see my journal in a more natural way and lets me capture my tweets, Foursquare check-ins, etc., automatically into Evernote.

Finally, I like Evernote the company and trust them with my data. Evernote rejects all indirect revenue – they only make money from premium and business subscriptions. This means they don’t have ads, and don’t do data mining on my content. They also want to be around a long time and not just ‘exit’ at the earliest possible moment.

These are all just reasons why I value Evernote as a user. There are other reasons I value Evernote as a developer and I’ll write those up another time.


Connecting to a remote PostgreSQL Database on Heroku from Clojure

Sometimes it’s useful to be able to debug a local application running against a remote production (or staging) database. The app I’m currently working on is a Clojure app, running on Heroku, using PostgreSQL as a database. It wasn’t entirely obvious (to me) how to do this, but here’s how I did it in the end.

  1. First, this assumes you’re using at least [org.clojure/java.jdbc “0.2.3”]. I thought at first it required a later version, but 0.2.3 seems to work.
  2. Get your regular Heroku DB URL. This will be of the form ‘postgres://[user]:[password]@[host]:[port]/[database]’
  3. Form a new DB URL as follows (substituting in the tokens from above) : ‘jdbc:postgresql://[host]:[port]/[database]?user=[user]&password=[password]&ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory’.
  4. Then for your db connection, use the map ‘{:connection-uri “[new-url]”}’

If I were going to do this frequently it would be easy to create a function to map the original Heroku URL to this remote-debugging URL, given the user, password, host, port, and database parsed out of the original.
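As a sketch of such a function (the name and the parsing regex are mine, and it assumes the URL follows the ‘postgres://user:password@host:port/database’ shape exactly):

```clojure
(ns heroku-db)

(defn heroku-url->jdbc-url
  "Maps a Heroku-style postgres:// URL to the jdbc:postgresql form
   described in the steps above, with SSL enabled."
  [heroku-url]
  (let [[_ user password host port database]
        (re-matches #"postgres://([^:]+):([^@]+)@([^:]+):(\d+)/(.+)" heroku-url)]
    (str "jdbc:postgresql://" host ":" port "/" database
         "?user=" user
         "&password=" password
         "&ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory")))

;; Then use it in the db connection map:
;; {:connection-uri (heroku-url->jdbc-url heroku-url)}
```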

My Evernote Conference 2013

The Evernote Conference (EC 2013), which happened last week (Sept 26, 27) in San Francisco, was not my usual conference. Typically I go to events that are mostly or solely geared around software development, that I’ve heard good things of directly from friends or colleagues, and where I know I’ll come across a few people I know. EC 2013 had none of these. So why did I go? And how did it turn out? I’ll give you the skinny – I didn’t get quite all that I hoped, but got more than I expected. For more, read on.

I’m in the early days of starting my own business. I’m a big fan of Evernote both as a basic app and as an integration platform. It fulfills a number of needs for me – organization, planning, content archiving, ‘read later’ items, list sharing (with my wife), etc. It’s also the backing platform for my journaling app – Orinoco. In Evernote’s first 5 years of existence it’s been very successful in its own right but the third-party application and integration ecosystem has in my mind been surprisingly sparse. I see this as an opportunity.

I went to EC 2013 with 3 main goals:

  • Get a good idea of concrete numbers of active Evernote users and Evernote Business customers
  • Get a better understanding of Evernote as a company – are they a group of people that I believe can continue to produce a platform successful enough that I want to build on it?
  • Networking with Evernote employees, other 3rd party developers, and potential customers for any business focussed work I may pursue

EC 2013 was Evernote’s 3rd annual public conference. The first 2 were primarily developer focussed but this year they opened up the theme to be much more user oriented. There were plenty of developer sessions though, and Evernote’s app competition – Dev Cup – was a big part of the event.

The morning keynotes each day were mostly geared around product launches. The first day’s was consumer focussed (including somewhat strange launches for Evernote-partnered bag manufacturing as part of the new Evernote Market), the second’s business focussed (centering on the new integration.)

The evening keynotes were both fascinating on one hand (Chris Anderson talking about his drone company 3D Robotics) and disappointing on the other (David Allen giving an overview of the thinking behind Getting Things Done, without adding anything significant that couldn’t be understood from his 10+ year old book.)

There were some decent breakout sessions. Evernote’s own Seth Hitchings gave a good ‘State of the Platform’ talk, with some useful data on where things stand with the Evernote Platform (the central storage / search / synchronization infrastructure that all Evernote and 3rd party apps integrate with), plus some useful announcements of what’s coming (support for reminders from the API; allowing users to give 3rd party apps access to only part of their Evernote account, etc.) Julien Boëdec (partner integrations manager at Evernote) gave a great, concise workshop session on Evernote Business integration, which included descriptions of some actual 3rd party integrations with Evernote Business.

My favorite part though was, as is common with me and conferences, the time between the sessions chatting to all different types of people. I met a good number of Evernote’s own employees (I’m pretty certain that most, if not all, of the company were there) including a couple of product managers, developers, their head of business sales, developer relations folk, etc. My takeaway from all of those conversations was that Evernote is built on a bunch of enthusiastic, smart, decent people. As an example I spent an enjoyable and enlightening hour or so with one of their developers chatting about various API concerns.

So what about my original goals?

  • Evernote have 75 million registered users. Unsurprisingly, but disappointingly, I couldn’t get a concrete number for active users, though I did hear from someone that it was in the 15 million range. I didn’t get any detail on whether that was monthly, annual, or something else. I’d really like to know how many people access Evernote at least once per month. 7,900 companies have bought Evernote Business, but they weren’t going into much more detail than that (I’d like to know how many have at least 20 users, at least 100 users, etc.)
  • As I said above all the people I met from Evernote came across as smart and enthusiastic. They are also capable – the new iOS 7 client was a complete re-write, going from conception to delivery, on a pretty new iOS platform, in 3 months. I dread to think the hours they were pulling to make that happen (their iOS dev team is not big) but that’s still damn impressive.
  • I’m not as gregarious as I could be but I still met plenty of folks there across the 3 categories I was concerned with.

That adds up to a decent result. Not perfect, but good.

What I also got though, and that I didn’t expect, was a really good feeling I’m on the right track. Of course everyone at the conference was an Evernote enthusiast, but this is a product, and platform, that has massive appeal across a broad swath of companies, individuals, and levels of technical savviness. I showed off Orinoco to a bunch of people and the feedback was universally positive. Either everyone is super nice when they’re on the west coast or this is something that shows promise.

I still don’t know the precise focus of where I want to end up (that’s what iterating is for, right?) but what the Evernote Conference showed me was that building off their platform ain’t a bad place to start.


20 years of Desktop Computers

This week I sold my iMac. I now no longer own a desktop computer, for the first time in 20 years. I’ll get to why at the end, but I thought it might be fun to take a look back over these 2 decades of my personal PC history.

My first computer was an IBM PS/1. It was 1993 and I was 14 years old. There was no computer in my house growing up before this time, so I’d been spending a lot of time in the school computer lab.

This PS/1 was a great machine. Its initial spec was a 486 DX 33 processor, 4MB of RAM, and a 170 MB hard disk, with a 14″ monitor which I typically ran at 800×600 resolution. For the time this was a pretty decent set of hardware. It ran Windows 3.1 and IBM’s ‘PC-DOS’ 5 (simply MS-DOS 5 rebranded.) It never developed a fault while I was using it.

It was my PC through to the end of my time at high school. It had a few upgrades over these years including a sound card (natively it only had the fairly useless PC speaker), a RAM upgrade to 8MB, a hard disk upgrade to 420 MB, adding a CD-ROM drive and various OS updates, the last being Windows 95.

By the summer of ’96 it was time for me to go to University, and I bought a new PC for the move. This was the first PC I built myself and from here on for the best part of a decade I was forever tinkering with my computers. As such my memories of specs get a little hazy. I do remember that the original incarnation of my first University PC had a Cyrix 6×86 processor – this was a mistake. The Cyrix chip was slow and crashed frequently (apparently they had heat problems.) I suffered through for the best part of a year before giving up and getting a new CPU and motherboard.

In the first year at college I networked my PC for the first time – using a very long serial-port null modem cable to the computer of my friend (of then and now) Mike Mason, who lived in a room about 50 feet away. We played Quake against each other and also amazed ourselves at being able to eject the CD-ROM drive of the other person’s machine. We clearly needed to get out more. Around this time I started using Linux, starting with the Slackware distribution.

In our second year at college Mike and I shared a house with a few friends, so the networking in my machine got upgraded to a BNC ethernet card. It was around this time that I started storing music in MP3 format – it used to take 2 – 3 hours to rip a CD on my computer. Winamp was the music player of choice. I had an Iomega Zip drive which allowed me for the first time to move my data around physically on something with more capacity than a 1.44MB floppy disk. The Zip drive, like so many of the time, was thrown out when it developed the infamous ‘click death’. USB memory sticks are far superior.

In my third and final year at college I moved back into college accommodation which came with a LAN-speed internet connection. This was a huge benefit. I was pretty concerned about having my PC hooked up straight to the internet though so I bought a second desktop PC to act mostly as a firewall and secondarily as my Linux machine. This required me to get a bit more serious about my Linux usage – I wouldn’t like to guess how much time I spent recompiling kernels that year.

A bunch of us across the university had network-connected PCs and set up what we would now call a VPN using a combination of CIPE and NFS. With this we could transfer files between each other without the university network being able to tell what we were doing. We were very proud of ourselves for doing this, and still needed to get out more.

I continued tinkering with my tower-case enclosed PCs over the first 3 years or so after college. I also bought my first laptop around this time. In 2002-ish I bought my first Shuttle ‘small form factor’ PC. This was a speedy machine, and also (for the time) tiny. I added a 2nd Shuttle to allow further tinkering.

From 2003 through spring 2006 I did a lot of moving between countries, so my desktop computing adventures slowed down here. In ’05 I bought my 2nd laptop, my 2nd Sony Vaio. At the end of 2005 I did buy my first Mac – a G4-powered Mac Mini – but this was mostly a ‘media PC’ rather than a serious desktop computer.

In late ’06, now living in New York, I bought my last desktop computer – a Core 2 Duo powered iMac. The biggest (literally) thing about this machine to me was its screen, all 24 inches of it, which at the time seemed stupidly huge. This was also the first time I seriously used a Mac. Despite being frustrated at first with Mac OS I soon got the hang of it, and wondered why it had taken me so long to start using a Mac.

The iMac was great and I was still using it until last summer – not bad for a 6-year-old machine. In this period I only upgraded one piece of hardware – a slight RAM bump to 3GB. My years of constant hardware tinkering were over.

Last summer I bought my 3rd laptop – a fully specced MacBook Air. This machine is screamingly fast and, hooked up to a 30″ display, easily does everything I need. The iMac was consigned to the floor of the spare room, where it sat until this week, when I sold it.

I still find it amazing that a machine of the diminutive proportions of my MacBook Air can perform like it does. Comparing it with my first machine it has a CPU roughly 500 times more powerful (by BogoMips), 2000 times more memory and 3000 times more disk space (and that’s with using an SSD). Truly we are living in the future.