Joseph Nelson, co-founder, Roboflow
Computer vision, generally speaking, sets out to process visual information in much the same way humans do with their eyes. It’s a rapidly expanding field within artificial intelligence (AI), and two founders in Iowa are using it to transform everyday life.
Joseph Nelson, along with his business partner Brad Dwyer, started Roboflow to help others take advantage of the potential of computer vision. Flyover Future spoke with Nelson about what they’re doing.
Give us some examples of computer vision.
Nelson: It could be a Nest camera that sends an alert when a package arrives on your doorstep or a check-scanning app on your mobile device for making a deposit to your bank. An insurance company may have aerial photos of a city after a storm and need an automated assessment of whether any of the roofs on the homes have sustained damage.
These are examples of computers understanding images intelligently. That’s the core of computer vision.
How did your company begin?
Nelson: We started in the summer of 2019. At the time, we were building our own computer vision consumer applications, creating things that made board games more fun to play. We released a Boggle solver [4x4 word game] in an app called BoardBoss and continued to add features for other board games.
We added the capability to understand chess boards and provide move recommendations. Playing chess implicitly relies on an application understanding the state of the board, meaning where each piece is [located] and its position relative to other pieces.
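The board-state problem Nelson describes can be sketched in miniature: given labeled piece detections from a vision model, map each bounding box onto a square of an 8x8 grid. The detection format and all names here are illustrative assumptions, not the actual BoardBoss or Roboflow code.

```python
def box_to_square(box, board_size=800):
    """Map an (x_min, y_min, x_max, y_max) box to a square like 'e2'.

    Assumes the board image is board_size x board_size pixels with
    the a8 square at the top-left corner (white's perspective).
    """
    x_min, y_min, x_max, y_max = box
    cx = (x_min + x_max) / 2          # bounding-box center
    cy = (y_min + y_max) / 2
    cell = board_size / 8             # pixel width of one square
    file_idx = int(cx // cell)        # 0..7 -> files a..h
    rank_idx = 8 - int(cy // cell)    # top row of pixels is rank 8
    return "abcdefgh"[file_idx] + str(rank_idx)

def detections_to_board(detections):
    """Turn a list of (label, box) detections into {square: label}."""
    return {box_to_square(box): label for label, box in detections}

# Hypothetical detections for two pieces on an 800x800 board image.
detections = [
    ("white_pawn", (410, 610, 480, 690)),   # center lands on e2
    ("black_king", (420, 10, 480, 90)),     # center lands on e8
]
board = detections_to_board(detections)
```

Once every detection is mapped to a square this way, the resulting dictionary is the board state a chess engine needs to recommend a move.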
As we continued to build more of these applications, it became clear that the tooling was really underdeveloped. We ended up building tools that make it easier to organize images, train models, and deploy those models – and it felt like reinventing the wheel. In November 2019, we decided to take our internal tools and develop them to be externally facing. In January 2020, we released Roboflow as a development toolset, and targeted the product to developers who want to use computer vision in their own use cases.
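The organize/train/deploy loop Nelson mentions can be illustrated with a deliberately tiny example: a nearest-centroid classifier over small grayscale "images" (flat lists of pixel values). This is purely a toy, it is not how Roboflow's tooling works; real computer vision pipelines use large labeled datasets and neural networks.

```python
def train(dataset):
    """'Train' by averaging the pixels of each class's images."""
    centroids = {}
    for label, images in dataset.items():
        n = len(images)
        # zip(*images) groups corresponding pixels across images
        centroids[label] = [sum(px) / n for px in zip(*images)]
    return centroids

def predict(centroids, image):
    """'Deploy': classify an image by its nearest class centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], image))

# Organize: a few hand-labeled 2x2 images, flattened to 4 pixels each.
dataset = {
    "dark":  [[10, 20, 15, 5], [0, 30, 25, 10]],
    "light": [[240, 250, 230, 255], [220, 245, 235, 250]],
}
model = train(dataset)
print(predict(model, [15, 25, 20, 10]))   # -> dark
```

The three stages here (labeling the dataset, fitting a model, then calling it on new inputs) are exactly the steps that the underdeveloped tooling made painful, and that Roboflow's toolset was built to streamline.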
How were you funded?
Nelson: Initially, my co-founder and I self-funded the business. We were fortunate early on in that we didn’t have to get any outside funding. We did receive external funding in the summer of 2020. We joined Y Combinator that summer, which wrote a $120,000 check. At the tail end of that, we did raise a seed round of about $2.6 million across a range of investors.
What in your background got you into this field?
Nelson: I have long been at the intersection of new technology and providing access to that new technology. My last company was also a machine learning company. We provided a product for U.S. congressional offices to sort through all of their inbound mail. In a sense, the U.S. Congress is like the customer support center for our democracy: Congress gets 40 million messages a year, 92% of which are digital.
Before that, I worked at Facebook on the government and policies team, working on data products. I also worked at a series of smaller companies, one of which was General Assembly, which provides education programs to those who are trying to transition into technology.
I’ve been in the space of enabling people to take advantage of technology, especially machine learning, for a long time.