Creating a unified algorithmic architecture to achieve human-level intelligence in vision, language, and motor control

Machines have enabled us to exceed our native abilities in myriad ways, but modern software achieves this by making narrow, often hard-coded assumptions about the problem domains it addresses. Even the most advanced AIs of today can drive cars or play games such as Jeopardy only by following narrow sets of hand-defined protocols, rather than by learning and reasoning the way humans do.

To this end, Vicarious is working to create software capable of the same kind of flexible thinking and dynamic learning as mammals. The company’s first project is a vision system that will serve as the backbone for machines with human-level reasoning. It is an ambitious goal, but one Vicarious is well positioned to pursue.

[blockquote quote=”If you’re coming from academia, you are familiar with the 9-18 month grant to publication cycle. Too often, the best way to get publishable results is to take last year’s top model, make some minor tweaks that improve accuracy on a standard dataset, and then rush to submit a paper. This cycle is not conducive to the kind of continual re-thinking needed to make fundamental advances. In industry, product orientation tends to breed mismatched incentives and politics that are toxic for long-term research. We started Vicarious to create a place where focused AI research could happen without friction.” source=”Scott Phoenix – Founder” reverse=”false”]

The company has raised almost $70 million, freeing it from the constraints of grant applications, publication deadlines, and product development cycles. This unique position gives the Vicarious team the time and patience to approach artificial intelligence in ways it could not in academia or industry. The company has also assembled an impressive cross-disciplinary team of neuroscientists, programmers, and machine-learning specialists who together bring decades of expertise in the field.