Category Archives: Tech

The Test Tube – Speed Dating meets User Experience Testing

One of the things I love about working out of WeWork Labs is coming across people in the technology world I wouldn’t normally meet, and finding out about what they’re up to. A couple of the members here run a meetup called The Test Tube (twitter at @testtubenyc). Its elevator pitch is ‘Speed Dating meets User Experience Testing’ which at first sounded like something I wouldn’t be that interested in. Other WeWorkers (yeah, I just did that, sorry) were highly complimentary of it though so I decided to try it out. And I’m very glad I did.

To set a little context here – I’m working on a brand new product with a friend of mine. We’re about 3 weeks in and have a very early, very rough, prototype of a small amount of what we want in the MVP. I thought it was too nascent to be able to get user feedback on but I was convinced by Pierre Wooldridge, one of the Test Tube organizers, to try out his meetup anyway.

This time it took place at Gilt’s office in midtown, and about 50 people were there. After brief introductions and a short talk from one of Gilt’s UX people we got down to business. Here’s how it works:

  • Everyone there is organized into a first pair, who I’ll refer to as Ms Green and Mr Red.
  • Ms Green has 7 minutes to get feedback on her product from Mr Red.
  • Ms Green starts by giving the briefest context possible, and by describing the scenario she’d like Mr Red to try to work through.
  • Mr Red then uses the product, vocalizing his thought process as he goes.
  • When there’s about a minute left they’ll try to summarize the experience.
  • Ms Green and Mr Red then swap roles, giving Mr Red 7 minutes to get feedback on his product from Ms Green.
  • After both people have gone through the process all pairs are rotated (a strict clock is kept in the room) and the process is repeated 4 more times, giving each person 5 different opportunities to get feedback.

I’d never done user experience (UX) testing with people I didn’t already know before, and I found the process absolutely fascinating. Even with the extremely raw product we currently have, there was enough there for our opposites to give what in their minds were just their first feelings but in ours was insightful feedback. As an example, in 4 of the 5 rotations one of the most basic assumptions that I’d already made about the product, which affects the very first screen of the application, turned out not to fit people’s expectations.

One of the truly brilliant aspects of The Test Tube is the time constraint. Not knowing the people you’re sitting with could lead to social awkwardness and hesitancy. But with only 7 minutes you’ve got no time for that and so you’re forced to plough straight in. Furthermore since there’s only 15 minutes per pair you can get 5 completely different sets of feedback in less than an hour and a half – brilliant!

I’d like to congratulate Pierre and Tom on a fantastic idea, well executed. I’d wholeheartedly recommend The Test Tube to other NYC software product developers, whether in startups or established businesses.

Building a Clojure project on Travis CI that’s in a sub-directory of the repository

This wasn’t entirely obvious to me, so here goes.

First some very quick background. Travis CI is a wonderful hosted continuous integration service that is free and very easy to use for open source projects on GitHub.

I have a Clojure project on GitHub that I want to build, but it’s in a sub-directory of its parent repository. It took me a few attempts to get Travis to handle this correctly, but in the end it was simply a matter of doing the following in the .travis.yml file:
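A minimal sketch of the working config – the key is to change directory in before_install (my-project-subdir here is a placeholder for your own sub-directory name):

    language: clojure
    # cd into the sub-directory before Travis runs 'lein deps' (the install step)
    before_install: cd my-project-subdir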

What doesn’t work (and what I tried before realizing what was going on) is using before_script, or putting the ‘cd’ call within a script line itself. This doesn’t work because Travis runs ‘lein deps’ after before_install, but before before_script (and therefore before script), and thus fails if you haven’t already changed to the directory containing your project.clj by that point.

My full .travis.yml at time of writing is as follows:
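In essence it looks something like this (again with a placeholder sub-directory name):

    language: clojure
    # change into the project's sub-directory before the default install step
    # ('lein deps') and the subsequent script step run
    before_install: cd my-project-subdir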

Clojurenote – A Clojure library to access the Evernote API

Recently I’ve been having a lot of fun using the Evernote API from Clojure, especially as part of developing Orinoco. I’ve now open-sourced my lowest-level Evernote code as Clojurenote.

Evernote already provides a thorough Java SDK and Clojurenote doesn’t aim to completely replace it. Clojurenote does, however, implement the following:

  • OAuth authentication (using the fantastic Scribe Java library)
  • Basic read/write capabilities, using an OAuth access token, or developer token
  • ENML to HTML translation (ENML is the XHTML derivative that is used for the content of Evernote notes)

You’ll still need to be happy mucking in with the Java SDK but Clojurenote will make a few things cleaner and easier for you in your Evernote-accessing Clojure code.
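To give a flavor of that interop, here’s a minimal Clojure sketch that goes straight at the Java SDK with a developer token and lists notebook names. This is plain interop rather than Clojurenote’s own API, and the class names are from the 1.25-era Evernote Java SDK, so treat the details as assumptions:

    (ns example.evernote
      (:import (com.evernote.auth EvernoteAuth EvernoteService)
               (com.evernote.clients ClientFactory)))

    ;; Illustrative only: authenticate against the Evernote sandbox with a developer
    ;; token and list notebook names. Clojurenote wraps this kind of boilerplate
    ;; (plus OAuth and ENML->HTML translation) in more idiomatic Clojure.
    (defn list-notebook-names [developer-token]
      (let [auth       (EvernoteAuth. EvernoteService/SANDBOX developer-token)
            note-store (.createNoteStoreClient (ClientFactory. auth))]
        (map #(.getName %) (.listNotebooks note-store))))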

Why Evernote?

I’ve been using Evernote a lot recently. In fact, I’ve gone a little bit overboard. Being an everyday human user wasn’t enough, so I started writing applications that use Evernote. And I went to their conference. And I’m even in the analysis stages of starting a business based on Evernote. But why? Here I give a few reasons that hopefully give some method to my Evernote madness.

Firstly a quick introduction for those who don’t know what Evernote is. It’s a tool for easily capturing pretty much any thought or content that can be put into an electronic document, tracking it however you like, from any device, on-line or off-line, with the ability to share with other people and applications.

Urrk, that’s not as quick as I’d like. I wanted to make the description smaller but it’s precisely the combination of all these things that makes Evernote so compelling to me. I’ll try to break them down a little.

I use Evernote for storing (at least) to-do items, reminders, emails I want to read later, web pages I want to save, travel plans, project ideas, a journal, restaurants I might want to go to with my wife, workout history, and a whole lot more. Evernote themselves like to use the phrase ‘outsource your memory’ – that describes my usage well.

One of the things I particularly like about Evernote is that it lets you organize things how you want, not how it wants. Evernote gives you a blank slate, a few cataloguing criteria, and then gets out of the way. The genius is that there’s enough to be powerful, but also little enough for you to organize, and change your organization process, as you see fit. This feature of Evernote is actually what causes a lot of people to struggle with it at first (and it took me a while to get used to), but frequent Evernote users are often the type who, once they get momentum, find more and more opportunities to increase the efficiency of their lives through even further usage of Evernote.

I use Evernote wherever I am. I primarily use the Mac Desktop app, but I also use the iPhone / iPad app, the web clipper, the email bridge, etc. The off-line capability of the iPhone app is great – I can check my important to-dos from the subway and then read some articles I’ve saved.

Sharing is useful too. As a premium user I set up notebooks with my wife that we can both edit. We typically use this for restaurants and bars that we want to try out, but shopping lists are another common use for other people. And that’s before you even start considering the business usages of shared notebooks.

Also of huge value is Evernote’s API. It means that many apps beyond Evernote’s own can integrate with my Evernote data. I’m a big fan of using Drafts on the iPhone to create content on the go. My own app, Orinoco, lets me see my journal in a more natural way and automatically captures my tweets, foursquare checkins, etc., into Evernote.

Finally, I like Evernote the company and trust them with my data. Evernote rejects all indirect revenue - they only make money from premium and business subscriptions. This means they don’t have ads, and don’t do data mining on my content. They also want to be around a long time and not just ‘exit’ at the earliest possible moment.

These are all just reasons why I value Evernote as a user. There are other reasons I value Evernote as a developer and I’ll write those up another time.

(cross-posted at http://mikeroberts.postach.io/why-evernote)

Connecting to a remote PostgreSQL Database on Heroku from Clojure

Sometimes it’s useful to be able to debug a local application running against a remote production (or staging) database. The app I’m currently working on is a Clojure app, running on Heroku, using PostgreSQL as a database. It wasn’t entirely obvious (to me) how to do this, but here’s how I did it in the end.

  1. First, this assumes you’re using at least [org.clojure/java.jdbc "0.2.3"]. I thought at first it required a later version, but 0.2.3 seems to work.
  2. Get your regular Heroku DB URL. This will be of the form ‘postgres://[user]:[password]@[host]:[port]/[database]’
  3. Form a new DB URL as follows (substituting in the tokens from above): ‘jdbc:postgresql://[host]:[port]/[database]?user=[user]&password=[password]&ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory’
  4. Then for your db connection, use the map ‘{:connection-uri "[new-url]"}’

If I was going to do this frequently it would be easy to create a function to map the original Heroku URL to this remote debugging URL. Assuming you’ve parsed out the server, port, etc., the following gist will work as a basis for this.
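A minimal sketch of what that could look like (the function name and the URL parsing here are my own, and I’m assuming the Heroku URL is available in the DATABASE_URL environment variable):

    (require '[clojure.string :as string])

    ;; Turn a Heroku-style postgres://user:password@host:port/database URL into a
    ;; JDBC URL that connects remotely over SSL (per steps 2-4 above).
    (defn heroku-url->jdbc-url [heroku-url]
      (let [uri             (java.net.URI. heroku-url)
            [user password] (string/split (.getUserInfo uri) #":")]
        (str "jdbc:postgresql://" (.getHost uri) ":" (.getPort uri) (.getPath uri)
             "?user=" user
             "&password=" password
             "&ssl=true&sslfactory=org.postgresql.ssl.NonValidatingFactory")))

    ;; Usage - a db spec map for clojure.java.jdbc:
    ;; {:connection-uri (heroku-url->jdbc-url (System/getenv "DATABASE_URL"))}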

My Evernote Conference 2013

The Evernote Conference (EC 2013), which happened last week (Sept 26, 27) in San Francisco, was not my usual conference. Typically I go to events that are mostly or solely geared around software development, that I’ve heard good things of directly from friends or colleagues, and where I know I’ll come across a few people I know. EC 2013 had none of these. So why did I go? And how did it turn out? I’ll give you the skinny – I didn’t get quite all that I hoped, but got more than I expected. For more, read on.

I’m in the early days of starting my own business. I’m a big fan of Evernote both as a basic app and as an integration platform. It fulfills a number of needs for me – organization, planning, content archiving, ‘read later’ items, list sharing (with my wife), etc. It’s also the backing platform for my journaling app - Orinoco. In Evernote’s first 5 years of existence it’s been very successful in its own right but the third-party application and integration ecosystem has in my mind been surprisingly sparse. I see this as an opportunity.

I went to EC 2013 with 3 main goals:

  • Get a good idea of concrete numbers of active Evernote users and Evernote Business customers
  • Get a better understanding of Evernote as a company – are they a group of people that I believe can continue to produce a platform successful enough that I want to build on it?
  • Network with Evernote employees, other 3rd party developers, and potential customers for any business-focussed work I may pursue

EC 2013 was Evernote’s 3rd annual public conference. The first 2 were primarily developer focussed but this year they opened up the theme to be much more user oriented. There were plenty of developer sessions though, and Evernote’s app competition - Dev Cup - was a big part of the event.

The morning keynotes each day were mostly geared around product launches. The first day’s was consumer focussed (including somewhat strange launches for Evernote-partnered bag manufacturing as part of the new Evernote Market), the second’s was business focussed (centering on the new Salesforce.com integration).

The evening keynotes were both fascinating on one hand (Chris Anderson talking about his drone company 3D Robotics) and disappointing on the other (David Allen giving an overview of the thinking behind Getting Things Done, without adding anything significant that couldn’t be understood from his 10+ year old book.)

There were some decent breakout sessions. Evernote’s own Seth Hitchings gave a good ‘State of the Platform’ talk, giving some useful data on where things stand with the Evernote Platform (the central storage / search / synchronization infrastructure that all Evernote and 3rd party apps integrate with), plus some useful announcements of things that are coming (support for reminders from the API, allowing users to give 3rd party apps access to only part of their Evernote account, and so on). Julien Boëdec (partner integrations manager at Evernote) gave a great, concise workshop session on Evernote Business integration which included descriptions of some actual 3rd party integrations with Evernote Business.

My favorite part though was, as is common with me and conferences, the time between the sessions chatting to all different types of people. I met a good number of Evernote’s own employees (I’m pretty certain that most, if not all, of the company were there) including a couple of product managers, developers, their head of business sales, developer relations folk, etc. My takeaway from all of those conversations was that Evernote is built on a bunch of enthusiastic, smart, decent people. As an example I spent an enjoyable and enlightening hour or so with one of their developers chatting about various API concerns.

So what about my original goals?

  • Evernote have 75 million registered users. Unsurprisingly, but disappointingly, I couldn’t get a concrete number for active users, but I did hear from someone that it was in the 15 million range. I didn’t get any detail on whether that was monthly, annually, or something else. I’d really like to know how many people access Evernote at least once per month. 7900 companies have bought Evernote Business, but they weren’t going into much more detail than that (I’d like to know how many have at least 20 users, at least 100 users, etc.)
  • As I said above all the people I met from Evernote came across as smart and enthusiastic. They are also capable – the new iOS 7 client was a complete re-write, going from conception to delivery, on a pretty new iOS platform, in 3 months. I dread to think the hours they were pulling to make that happen (their iOS dev team is not big) but that’s still damn impressive.
  • I’m not as gregarious as I could be but I still met plenty of folks there across the 3 categories I was concerned with.

That adds up to a decent result. Not perfect, but good.

What I also got, though, and what I didn’t expect, was a really good feeling that I’m on the right track. Of course everyone at the conference was an Evernote enthusiast, but this is a product, and platform, that has massive appeal across a broad swath of companies, individuals and levels of technical savviness. I showed off Orinoco to a bunch of people and the feedback was universally positive. Either everyone is super nice when they’re on the west coast or this is something that shows promise.

I still don’t know the precise focus of where I want to end up (that’s what iterating is for, right?) but what the Evernote Conference showed me was that building off their platform ain’t a bad place to start.

(cross posted at http://mikeroberts.postach.io/my-evernote-conference-2013)

20 years of Desktop Computers

This week I sold my iMac. I now no longer own a desktop computer, for the first time in 20 years. I’ll get to why at the end, but I thought it might be fun to take a look back over these 2 decades of my personal PC history.

My first computer was an IBM PS/1. It was 1993 and I was 14 years old. There was no computer in my house growing up before this time so I’d been spending a lot of time in the school computer lab.

This PS/1 was a great machine. Its initial spec was a 486 DX 33 processor, 4MB RAM, and a 170 MB hard disk, with a 14″ monitor which I typically ran at 800×600 resolution. For the time this was a pretty decent set of hardware. It ran Windows 3.1 and IBM’s ‘PC-DOS’ 5 (simply MS-DOS 5 rebranded). It never developed a fault while I was using it.

It was my PC through to the end of my time at high school. It had a few upgrades over these years including a sound card (natively it only had the fairly useless PC speaker), a RAM upgrade to 8MB, a hard disk upgrade to 420 MB, adding a CD-ROM drive and various OS updates, the last being Windows 95.

By the summer of ’96 it was time for me to go to University, and I bought a new PC for the move. This was the first PC I built myself and from here on for the best part of a decade I was forever tinkering with my computers. As such my memories of specs get a little hazy. I do remember that the original incarnation of my first University PC had a Cyrix 6×86 processor – this was a mistake. The Cyrix chip was slow and crashed frequently (apparently they had heat problems.) I suffered through for the best part of a year before giving up and getting a new CPU and motherboard.

In the first year at college I networked my PC for the first time – using a very long serial-port null modem cable to the computer of my friend (of then and now) Mike Mason, who lived in a room about 50 feet away. We played Quake against each other and also amazed ourselves at being able to eject the CD-ROM drive of the other person’s machine. We clearly needed to get out more. Around this time I started using Linux, starting with the Slackware distribution.

In our second year at college Mike and I shared a house with a few friends, so the networking in my machine got upgraded to have a BNC ethernet card. It was around this time that storing music in MP3 format started – it used to take 2–3 hours to rip a CD on my computer. Winamp was the music player of choice. I had an Iomega Zip drive which allowed me for the first time to move my data around physically on something with more capacity than a 1.44MB floppy disk. The Zip drive, like so many of the time, was thrown out when it developed the infamous ‘click death’. USB memory sticks are far superior.

In my third and final year at college I moved back into college accommodation which came with a LAN-speed internet connection. This was a huge benefit. I was pretty concerned about having my PC hooked up straight to the internet though so I bought a second desktop PC to act mostly as a firewall and secondarily as my Linux machine. This required me to get a bit more serious about my Linux usage – I wouldn’t like to guess how much time I spent recompiling kernels that year.

A bunch of us across the university had network-connected PCs and set up what we would now call a VPN using a combination of CIPE and NFS. With this we could transfer files between each other without the university network being able to tell what we were doing. We were very proud of ourselves for doing this and still needed to get out more.

I continued tinkering with my tower-case enclosed PCs over the first 3 years or so after college. I also bought my first laptop around this time. In 2002-ish I bought my first Shuttle ‘small form factor’ PC. This was a speedy machine, and also (for the time) tiny. I added a 2nd Shuttle to allow further tinkering.

From 2003 through spring 2006 I did a lot of moving between countries, so my desktop computing adventures slowed down here. In ’05 I bought my 2nd laptop, my 2nd Sony Vaio. At the end of 2005 I did buy my first Mac – a G4-powered Mac Mini – but this was mostly a ‘media PC’ rather than a serious desktop computer.

In late ’06, now living in New York, I bought my last desktop computer – a Core 2 Duo powered iMac. The biggest (literally) thing about this machine was its screen, all 24 inches of it, which at the time seemed stupidly huge to me. This was also the first time I seriously used a Mac. Despite being frustrated at first with Mac OS I soon got the hang of it, and wondered what had taken me so long to start using a Mac.

The iMac was great and I was still using it through until last summer – not bad for a 6 year old machine. In this period I only upgraded one piece of hardware – giving it a slight RAM upgrade to 3GB. My years of constant hardware tinkering were over.

Last summer I bought my 3rd laptop – a fully specced MacBook Air. This machine is screamingly fast and, hooked up to a 30″ display, easily does everything I need. The iMac was consigned to the floor of the spare room, where it sat until this week, when I sold it.

I still find it amazing that a machine of the diminutive proportions of my MacBook Air can perform like it does. Comparing it with my first machine it has a CPU roughly 500 times more powerful (by BogoMips), 2000 times more memory and 3000 times more disk space (and that’s with using an SSD). Truly we are living in the future.

Asking better questions

I didn’t know Aaron Swartz. I met him very briefly in December but that was all. Nevertheless, hearing this last week and a half from those who did know him, I’ve come to realize what an amazing human he was, and how much of a loss his passing away too soon is for the world.

I watched online some of the memorial for Aaron that took place in New York last Saturday. I was most impressed and moved by the last speech, from his partner Taren Stinebrickner-Kauffman. There was much in what she said about the legal pressures surrounding Aaron’s last year, but what resonated most with me was this section:

Aaron didn’t believe he was smarter than anyone else, which is hard for — it was very hard for me to accept that he really believed that. He really, really believed that he was not smarter than anybody else. He just thought he asked better questions.

He believed that every single person in this room is capable of doing as much as he did, if you just ask the right questions.

Whatever you’re doing, are you confused? Is there anything that doesn’t quite make sense about what you’re doing? What is it? Never assume that someone else has noticed that niggling sense of doubt and already resolved the issue for themselves. They haven’t. The world does not make sense, and if you think it does it’s because you’re not asking hard enough questions.

If you’re in the tech sector, why are you there? What do you really believe in? If you believe that technology is making the world a better place, why do you believe that? Do you really understand what makes the world a bad place to begin with?

I’m serious. If you’re in this room and you work in the technology sector, I’m asking you that question. Do you understand what makes the world a bad place to begin with? Have you ever spent time with and listened to the people your technology is supposed to be helping? Or the people it might be hurting?

While I do believe that much needs to be done with regard to the unfairness of Aaron’s trial, there is little I can personally do about that. But the calling above is something that we all in the software development world can consider. If some of us act on this then Aaron’s passing will be a little less in vain.

The video of Aaron’s NY memorial is here. Taren’s speech is at about the 1:47 mark. Thanks to Chris Burkhardt for transcribing Taren’s speech – the full text is available here. My sympathies go out to all of Aaron’s family, friends and colleagues at this time.

Agile people over agile process

In June 2012 I gave a talk at QCon New York titled ‘Agile people over Agile process’. The full talk is here, and below are some of my thoughts on this topic.

Summary

What’s below is pretty long so if you don’t want to read it all, here’s the essence of my opinion.

In the ‘agile world’ these days I see a decent amount of pre-defined, unarguable process and dogma – the very things that the agile movement initially tried to do away with. I think it’s worth stepping away from this and focussing first on individuals, how they communicate, and how as a team they best choose their techniques and tools.

There are no such things as ‘best practices’, at least when it comes to being part of a software team or software project. There are practices that teams find useful within their context, but each context is different. Teams would do well to continually re-judge what process and tooling works best for them.

Agile teams can use values and principles to help drive their choice of process and tools.

So let’s begin…

There’s too much focus on process

When I got started in the agile world 10+ years ago we used to talk a lot about extreme programming (XP), Scrum, and the like. Obviously part of that was figuring out test driven development, pair programming, continuous integration, iterations, etc. A lot of it was also about how we needed to change as individuals though. Gone were the times where we could just sit in our cubicle and complete our individual tasks on the almighty gantt chart. No longer could we assume that we didn’t need to test code because that was someone else’s responsibility. We needed to embrace how we worked as a collaborative team, and not just argue over Emacs vs Vi. This was a revolution in how we identified as humans on a software project.

People back then accused XP of being a developer-focused methodology, and they were right, but this was with good reason. For developers to be most effective they needed to stop just being pawns in a bigger process and start talking to people, work with feedback, and take responsibility for delivery. XP helped them do this.

People in the agile world still talk a lot about Scrum, lean, kanban, etc., just like we used to 10 years ago. However I feel the tone of a lot of conversations has changed – now a lot of times it’s just about the process. Agile seems to no longer be about people changing their attitude to projects, to delivery or most importantly to people. Now often it’s just about introducing a new team management methodology in the hope that Lean or XP or whatever will be a process magic bullet that will solve all their problems.

But with process, as with many other things, there is no magic bullet.

Process is very important. It’s where the rubber of any methodology hits the road, but there are problems with an overly-zealous focus on process:

  1. Processes can become kings. Processes are at the end of the day just tools – they have no intrinsic value by themselves. They can only be judged valuable in the context of their application. When processes become kings then our discussions descend to hypothetical judgments of a supposed intrinsic value that the processes don’t have. Such discussions are a waste of time and don’t help anyone.
  2. If processes are considered axiomatic then we can no longer adapt how we work. If we believe the best way to do something is X, yet we do not understand the motivation for X, how can we decide if Y would be better?
  3. It misses the point of what Agile was supposed to be about…

What I think is important about Agile

The Agile Manifesto starts as follows:

We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
Individuals and interactions over processes and tools…

In other words the unique characteristics, personalities and abilities of members of a software development team, and the conversations that they have with each other and with their stakeholders (management, users, etc.) are worth considering more than the process and tools that they use.

This is not to say at all that processes and tools are unimportant; I am merely arguing that they are not so important that they should drive discussion about the basic way we choose to deliver software.

Focussing on individuals sounds like a management technique. While that is important, I think it is more a call to individuals themselves to consider how they can most effectively be part of a software delivery effort.

There are many ways individuals can answer this call, but one way that may be useful is to look at the values and principles from ‘Extreme Programming Explained’ – Kent Beck’s original book about XP. These values and principles are not specific descriptions of how to do something but guides to help people decide how they actually want to work. I’m not going to go into detail here since literature is already available for each point (and I also discuss them further in my talk at around the 13 minute mark), but a list of ideas is as follows:

Values

  • Simplicity
  • Communication
  • Feedback
  • Courage

Principles

  • Assume the values are present
  • Incremental change
  • Deliver at the earliest responsible time (an addition / variant of mine that I think is worth considering separately)
  • Quality (the Jim Highsmith article I refer to in the talk is here)
  • Accepted responsibility - everyone on the team should assume they have responsibility
  • Local adaptation – change everything according to context

How this applies to practice

These values and principles are all somewhat theoretical – the application of them is the actual choice of what set of practices and processes a team chooses to use. Not one pre-defined overall process, but an active, continuing choice of what techniques to use.

This is necessary since in the case of software development teams and projects, there are no such things as best practices. There are practices that teams find useful within their own context, but this is not an absolute categorization.

How I’ve recently embraced this

In the video I describe how I applied these ideas on my most recent project. It starts at about the 28 minute mark. It made sense to include this in the talk but I’m going to leave detail out of this post.

It is worth mentioning here though that there are certain ‘typical practices’ of agile that we did use but others that we chose not to. For example we didn’t use ‘iterations’ to structure our week-to-week work. However we did often deploy new software, we frequently re-prioritized our next work, etc. Since we already did these things formal iterations in our world would have been unnecessary baggage. For many other teams formal iterations would be very valuable.

Is this really for everyone?

In discussing this subject some people have challenged me that this way of thinking is only useful for ‘experts’, such as people who already have previous agile experience. I disagree. While I think that picking an off-the-shelf methodology might make for a ‘decent first set’ of practices, a team needs to know from the beginning at least something of why that may be so. I think for someone with experience to provide a pre-canned process set as ‘the way things should be’ is disingenuous.

I wouldn’t expect everyone on a team new to agile to be able to immediately make their own choices about the entire implementation of principle to practice, but I would expect them to know that the introspection of their process (based on values and principles), and their subsequent refinement of the same, is a more important aspect of agile development than any of the individual techniques they may be using.

Concluding thoughts

None of this is new at all, and a lot of the good agile literature from the last decade describes these ideas. As a more recent example Ross Pettit does a good job talking about them here.

I think it’s worthwhile repeating it though for 2 reasons:

  • I see some amount of the agile community as a whole moving to a ‘process first’ mindset and I disagree with it.
  • I’ve seen myself at times throughout my career treating a process, practice or technique as dogma. Invariably when I do this it’s because I’ve missed something important. Stepping back and thinking ‘why’ always leads to improvement. I think this is a valuable reminder to myself and hopefully others.

Leaving DRW, and my take on Customer Affinity

Last month I finished working at DRW Trading after nearly 4 years there. DRW has a fantastic technical organization on both the Software Engineering (SE) and IT Operations side, from the leaders (CTO Lyle Hayhurst, COO Seth Thomson and Director of SE Derek Groothuis) down. In many ways I expect this will be one of the best jobs I ever have – my technical colleagues were fantastic (especially Jay Fields, my right hand man for the last 18 months), my team had complete management and implementation control of our projects, I didn’t have to deal with much in the way of politics at all, and yes, the pay was good!

So the obvious question is why leave? As usual in such cases it’s not a simple answer. I’m going to go on something of a tangent before I give a couple of reasons.

With software development jobs where the goal is producing a product, that product is not an end in itself for the customer [1]. The product is a tool that will be used by the customer to do something. Compare this with doctors or actors – in the first making people healthier is the sole end goal, in the second entertaining is the sole end goal [2].

(Good) software developers have some understanding of what the thing they’re making is going to be used for. Compare this with structural engineers responsible for building a bridge – they know they’re building a bridge but they’ve no idea of the final destination of the people traveling over it.

I think most software development roles in some ways resemble being a lawyer (hear me out!). As a lawyer the principal aim, in the context of litigation at least, is to win the case, but you’re always doing so for a particular type of case. You might be a criminal lawyer, patent attorney, etc.

Most lawyers I know are interested not just in the practice of law itself, but also to some extent in their more specific field. A non-profit housing lawyer (to use an example very close to me, thank you Sara) might not just be interested in winning cases, but in helping less privileged tenants have a fair voice against landlords with far more means.

And so we come back to software. There’s no doubt that as ‘software people’ we are interested in making stuff. We all have parts of ‘making’ we’re more interested in, whether it be the technical design, the project process, the user interface, etc. But I would describe these all as second-order goals.

The first-order goal is the actual thing that will be useful to the customer. Again, I’ll say what I did above – good developers have an understanding of the first-order goal (in other words they have what Martin Fowler calls Customer Affinity). Excellent developers for the long term have not just an understanding of, but an active interest in, the first-order goal (otherwise known in our field as the domain).

For most of the time I was at DRW the second-order problems I was solving were fascinating to me. With a very small team we built and maintained a large, solid, well-appreciated application. Furthermore I was making good inroads into understanding the domain (commodities trading). Once we’d solved a lot of the second-order problems though what remained was understanding, and appreciating, the domain.

The problem was that I don’t find trading very interesting. To me it’s ‘just’ math and moving money around. Trading in some ways has a lot in common with gambling: assessing financial risk and taking positions (‘bets’) based on what you currently see in the slice of the world around you. I’ve never been particularly interested in gambling and I think the 2 are linked. I know many excellent developers who are truly interested in trading (Jay being one of them) – I don’t have a problem with them, I just don’t share their enthusiasm for this particular domain.

Further though, I think to be an excellent developer in the long term not only must you appreciate / have passion for what the users of the software want to do with the software, you also need to have empathy for the users themselves. Martin says this at the end of the first paragraph in the link above : “[Customer Affinity] is the interest and closeness that the developers have in the business problem that the software is addressing, and in the people who live in that business world.” (emphasis mine).

This leads to a second reason I left DRW – I didn’t empathize with many of the traders I worked with. Note that I’m absolutely not saying I thought they were wrong (they’re certainly far more financially successful than I am, at least); what I’m saying is that my approach to work and theirs didn’t meld in the general sense. I don’t think it’s useful to get too specific here, and probably this is something you wouldn’t know either way until you’ve been in trading for some time, but I think there’s a general lesson worth taking from this beyond just trading.

One thing I’m happy about is that I realized these things before they started to make an impact on my work. Being able to leave knowing you’ve done your best, but that you wouldn’t be doing your best if you stayed, is a very satisfying position to be in.

The reasons I give for my leaving sound negative, but I’m picking specifically the reasons I left, not the many, many reasons I stayed for nearly 4 years. Of the financial services companies I’ve worked with I enjoyed being at DRW by far the most, I don’t regret my time there at all, and I absolutely appreciate the opportunity I had (and am grateful to the leadership of DRW for giving it to me.) For developers wanting to join, or continue in, the trading industry I would recommend joining DRW wholeheartedly.

Of course, there’s an obvious postscript here. What am I doing now? I’m taking time off! I’ve never taken extended leave in my life, not ever since school, and I’m fortunate to be able to do so now. I have a few ideas of what I might do and I look forward to updating this blog as some of them (and others!) come to fruition.

[1] There are other types of software development jobs, e.g. programming language research, training, and ‘pure’ consultancy (as opposed to ‘body shopping’). The difference with consulting is that you’re not building a product – the end goal is to help other people build a product. I would even consider going back into finance as a ‘pure’ consultant.
[2] Maybe I’m oversimplifying here, but I don’t think I’m too far off the mark.