By Sarah Catanzaro
I’ve always loved Philippe Starck, the French industrial designer known for chairs inspired by mid-century masters. So I was eagerly anticipating my coffee meeting on a brisk September day in 2018 with Philipp Schmitt, who, like Starck, was inspired to apply advanced mathematical and computational concepts to design chairs. I was awed as Philipp described how he reversed the roles of human and machine in the design process, using GANs to generate visual prompts that creatively inspired human designers. While I knew that ML could automate routine tasks like object selection and background removal, I had not imagined just how profoundly AI could impact the creative process.
However, I left the meeting dismayed after Philipp described the challenges he had encountered in using TensorFlow and other frameworks designed for ML research. Nonetheless, my hope was renewed a few weeks later when Philipp introduced me to Cristobal Valenzuela, who had recently completed his research at NYU’s Interactive Telecommunications Program (ITP) on a new application to make ML accessible to creatives like Philipp and others with less technical expertise.
Soon after, Cris was guiding me through the halls of ITP, explaining how students were using new technologies, from 3D printing to AR/VR, for new forms of representation and expression.
Shortly thereafter, we invested in Runway, funding Cris and his co-founders, Alejandro Matamala and Anastasis Germanidis, to transform the creative process. The team was committed not only to streamlining design work with AI, but also to enabling creatives to perceive and conceive in radically different ways.
Now, two years later, we are investing in Runway again, leading the company’s $8.5M Series A funding round. Since that first investment, Runway has proven that they can enable artists, musicians, designers, and filmmakers throughout the world to use ML. They’ve empowered designers at New Balance to apply generative models to craft new sneakers. They’ve empowered the Grammy-winning dance-pop band YACHT to use language models to write songs on their album Chain Tripping. Their users are creating modern art installations, board games, tattoos, and textiles.
Since the company’s inception, the Runway team and their community have explored the many ways in which ML can impact the creative process. Their Series A funding will enable them to encode these learnings in a next-gen creative toolkit, which we believe will have an impact as profound as that of the camera. The first release toward this vision is Green Screen, which enables video creators to perform professional video rotoscoping in seconds.
To democratize access to ML, you must deeply empathize with the users you serve. To enable new forms of invention and iteration, you must understand the creative process. Cris, Alejandro, and Anastasis are brilliant technologists, but they’re also artists who have exhibited at contemporary art centers and art festivals throughout the world. They not only grasp their users’ workflows and requirements, they also understand their complex motivations and intentions. They make subtle decisions that align with user expectations. We believe their user empathy and authenticity have enabled them to attract a massive community, one that has uploaded over 24 million assets and run close to 900,000 models. That same empathy will enable them to continue to expand and work with the community as they redefine art.
For centuries, the creative community has had a profound impact on culture and society. We expect that, as creatives leverage Runway’s platform to make impossible things, this impact will grow even deeper. We’re honored to play a part in such an important company.