My current role allows me to engineer the AI learning libraries used at Sony AI to train game AI. These were used to create GT Sophy, the AI agent that beat the best players in the world in the racing simulator Gran Turismo.
You can learn more about this system in the published Nature paper.
I led the design of a backend framework that connects and coordinates user requests as live commands to an autonomous delivery robot. It used various cloud technologies to connect live, autonomous robots with users via a smartphone app. This was part of project SMADS, which centered on robot autonomy in an urban environment.
I’ve worked on autonomous navigation stacks for indoor environments across a number of platforms, for universities as well as companies such as Diligent Robotics and Svenzva Robotics.
Visual Music started as a commissioned art project that turned into a Web3 app. It began as a physical box that displays music: not a synthesized visualizer, but a visualization of the sound itself.
The box and the electronics within are completely custom. It produces a laser display driven by high-baud galvanometers that take sound as direct input.
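Using sound as direct input works much like an oscilloscope in X-Y mode: one audio channel steers the X galvanometer, the other steers Y, so the laser traces the waveform itself rather than a derived visualization. A minimal sketch of that mapping, assuming stereo samples in [-1, 1] and a generic 12-bit DAC (my own illustration, not the box's actual firmware):

```python
import math

def samples_to_galvo(left, right, dac_bits=12):
    """Map stereo audio samples in [-1.0, 1.0] to X/Y galvanometer
    DAC codes, oscilloscope-style: left channel drives X, right drives Y."""
    full_scale = (1 << dac_bits) - 1          # e.g. 4095 for a 12-bit DAC
    mid = full_scale / 2
    points = []
    for l, r in zip(left, right):
        # Clamp each sample, then shift/scale from [-1, 1] to [0, full_scale]
        x = round(mid + max(-1.0, min(1.0, l)) * mid)
        y = round(mid + max(-1.0, min(1.0, r)) * mid)
        points.append((x, y))
    return points

# A 440 Hz / 880 Hz stereo pair traces a 1:2 Lissajous figure on the galvos
n, rate = 64, 48000
left  = [math.sin(2 * math.pi * 440 * t / rate) for t in range(n)]
right = [math.sin(2 * math.pi * 880 * t / rate) for t in range(n)]
points = samples_to_galvo(left, right)
```

Because the galvo positions come straight from the samples, harmonically related channels produce Lissajous-style figures, which is what makes the display a visualization of the sound itself.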
The VisualMusic web app takes the same concept as the box and displays it in a browser. Much more is planned for the web app, including a live, interactive component with the physical laser box. The app uses blockchain to give artists ownership of tracks that they can develop specifically for display.
This project was both an exploration of front-end frameworks (specifically Vue.js) and an excuse to delve into Web3.