Developing our processes has been a constant challenge over the past years, so in 2017 we decided to set up a dedicated team for the task. Three people now work full-time on cutting-edge solutions such as our custom VR services and archviz-related programming. To give you an idea: this year 10% of our budget goes to this team. That’s a lot, but we believe it is an important investment toward ever higher-quality archviz products and services. To sum it up: Brick has been an experimental company from the beginning; constantly challenging our know-how and tool set is what has enabled us to improve so quickly, and this mindset is the engine of the company.
In addition, the team works as a research lab where new products and solutions are forged. Team members: Zsolt Simon (Unity, programming), Péter Sárhidai (programming), Mihály Paseczki (vfx, programming), Cselovszki Attila (CDO).
No surprise, our most important subject in 2017 has been VR. At the moment we are focusing on developing our RealtimeVR and Panoramic360 products, and our goal is to improve our pipeline.
Seeing the mixed reality tutorials, we wanted to take it one step further and exploit the Vive’s potential. We got the idea of skipping the tracking process on the actual footage altogether, and started working out a solution.
Elements of the system:
– Live connection between the Tracker and 3DS Max: tracker data is imported into 3DS Max through a Unity application.
– Live keyed image sent from the camera to a control monitor.
– The camera in 3DS Max is synchronized with the actual physical camera.
– A log file recording the spatial coordinates of the camera.
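To make the first and last elements more concrete, here is a minimal sketch of how the pose stream and the coordinate log could be wired together. Everything in it is our illustration, not Brick’s actual implementation: we assume the Unity application sends one JSON packet per frame over UDP, and the message format, port, and function names are all hypothetical.

```python
import json
import socket

# Hypothetical packet format sent by the Unity side, one per frame:
# {"t": 0.033, "pos": [x, y, z], "rot": [rx, ry, rz]}

def parse_pose(packet: bytes) -> dict:
    """Decode one pose packet into time, position, and rotation."""
    data = json.loads(packet.decode("utf-8"))
    return {
        "time": float(data["t"]),
        "position": tuple(data["pos"]),
        "rotation": tuple(data["rot"]),
    }

def log_pose(pose: dict, log_file) -> None:
    """Append the camera's spatial coordinates to a plain-text log,
    so the take can be re-rendered later with a matching camera."""
    x, y, z = pose["position"]
    rx, ry, rz = pose["rotation"]
    log_file.write(f'{pose["time"]:.3f} {x} {y} {z} {rx} {ry} {rz}\n')

def listen(host: str = "127.0.0.1", port: int = 5005) -> None:
    """Receive pose packets over UDP and log them; a 3DS Max-side
    script could read the same stream to drive the virtual camera."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    with open("camera_track.log", "a") as log_file:
        while True:
            packet, _ = sock.recvfrom(1024)
            log_pose(parse_pose(packet), log_file)
```

The point of the log file is exactly what the list says: once the physical camera’s coordinates are recorded per frame, the same move can be replayed on the 3DS Max camera without any tracking pass on the footage.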
If all of these work, then:
– We can render background footage with a matching camera, without the frustrating tracking process.
– We can direct the real camera in the studio based on the virtual environment.
– Without the need for tracking markers, the keying process also becomes much easier.
The link between the tracker and 3DS Max was implemented with a program that translates between Unity and Max. Merging the synchronized camera image with the render from Max was also done in a custom Unity application.
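The keying and merging step can be illustrated with the simplest possible chroma-key: pixels close to the screen color become transparent, and the rendered background shows through. This is a bare-bones NumPy sketch of the idea only; the actual keyer in the Unity application is certainly more sophisticated, and the function names and threshold value are our own.

```python
import numpy as np

def chroma_key(frame, key_color=(0, 255, 0), threshold=120.0):
    """Return an alpha mask: 0 where the pixel is close to the key
    color (transparent screen), 255 elsewhere (foreground)."""
    diff = frame.astype(np.float64) - np.array(key_color, dtype=np.float64)
    distance = np.sqrt((diff ** 2).sum(axis=-1))  # per-pixel RGB distance
    return np.where(distance < threshold, 0, 255).astype(np.uint8)

def composite(foreground, background, alpha):
    """Merge the keyed camera image over the rendered background."""
    a = alpha[..., None] / 255.0
    return (foreground * a + background * (1.0 - a)).astype(np.uint8)
```

This is also why dropping tracking markers helps the keying: with no markers on the green screen, the key color stays uniform and a simple distance threshold like the one above has far fewer edge cases to handle.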
The following video gives you an insight into the steps of this experiment:
And here is the final result:
Later on we will show you the tools we are building for our pipeline development, such as the post-production plugin for panoramic 360 images or 3DS Max scene management.
Stay tuned …