It’s been a few weeks since this happened, and now that I’ve processed it all, I’ve finally gotten around to talking about it.

I left Walt Disney Imagineering in early June.

The whole thing came as a surprise, since I was actually expecting to be put on a new project based on what I’d heard from management at the end of February, when I asked what I would be doing after Shanghai.

The official wording is that my contract has been completed and my role is no longer needed. While technically true, I do wish there had been a little more warning.

I’ve been asked whether I hate Disney now that I’m no longer working for them. I still think that working for Disney, particularly at WDI, was one of the best jobs I’ve ever had. The things I learned there will assuredly be of use for the rest of my life; working at WDI taught me the value of ideation, and the skills and mindset that allow it to happen.

Things are never as bad as they seem. I took a week off after my time with Disney was over and started looking at other opportunities, not just for work, but for what I want to be doing next. I started planning for things that I never had time for while I was working at WDI. It’s been refreshing to be able to fully commit myself to whatever I want to do. Having fallen victim to the “I-wish-I-had-more-time-but-work-is-in-the-way” syndrome, I now really have no excuse.

The day I left Disney became the day I started holding myself fully accountable. It’s a scary and confusing road, but one that I’m okay with going down.

Learning Blender – Week 2

After learning the basics of Blender the previous week, I started delving into some of the really cool aspects of the tool: rigging and animation. This was a particularly long section that involved numerous new concepts, such as creating an armature to build relationships between the different limbs of an object, in this case a lamp:

A lamp consisting of a base, a stem, two arms, and a lampshade.

The lamp is designed with separate pieces in mind so that when it comes time to animate, we can use Blender’s IK (Inverse Kinematics) feature to pose it for us automatically.

Lamp, with a timeline below to show motion
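Under the hood, an IK solver works backward from a target position to the joint angles that reach it, which is why posing the lamp is so easy: you drag the end, and the solver bends the joints. As a toy illustration of the idea (this is not Blender’s actual solver, just a two-bone planar solve using the law of cosines), here’s a sketch in plain Python:

```python
from math import acos, atan2, cos, sin, hypot

def two_bone_ik(tx, ty, l1, l2):
    """Solve joint angles so a two-segment arm reaches (tx, ty).

    Returns (shoulder, elbow) angles in radians, or None if the
    target is out of reach.
    """
    d = hypot(tx, ty)
    if d > l1 + l2 or d < abs(l1 - l2):
        return None  # target unreachable for these segment lengths
    # Law of cosines gives the elbow bend for the given reach distance.
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = acos(max(-1.0, min(1.0, cos_elbow)))
    # Shoulder angle: aim at the target, then correct for the elbow bend.
    shoulder = atan2(ty, tx) - atan2(l2 * sin(elbow), l1 + l2 * cos(elbow))
    return shoulder, elbow

def forward(shoulder, elbow, l1, l2):
    """Forward kinematics: end-effector position for given joint angles."""
    return (l1 * cos(shoulder) + l2 * cos(shoulder + elbow),
            l1 * sin(shoulder) + l2 * sin(shoulder + elbow))

angles = two_bone_ik(1.0, 1.0, 1.0, 1.0)
print(forward(*angles, 1.0, 1.0))  # reaches (1.0, 1.0), as requested
```

Blender generalizes this to chains of arbitrary length and to 3D, but the principle is the same: specify where the end should go, and the joint rotations fall out.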

The section also explored shader models, which allow for complex representations of an object’s surface. In this case, a gem was added with a glass-like, translucent material.

Using the node editor to create complex shaders

All of these things come together to create a photo-realistic scene, with correct lighting and reflections. This was achieved using a method called ray tracing, which is computationally very expensive but produces very good results.

Rendered Scene. Note the reflections of the light on the plastic base of the lamp, as well as the diffusion of the light through the back of the gem and the mirrored look of the front.

Creating that frame took my GTX 960 GPU about 2.5 minutes to compute. Combining the animation feature with the renderer enables someone to make short films using this method. I created a 6-second video of the lamp and gem, which ended up taking about 7 hours for the computer to create (I put the computer to work overnight to crank it out). It could be faster using multiple computers or a better GPU, but for something simple like this, it’s good enough. It was a joy to see the finished product when I woke up the next day!
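The overnight render time lines up roughly with the per-frame cost. A quick back-of-the-envelope estimate, assuming Blender’s default 24 fps (I didn’t note the actual frame rate):

```python
# Estimate total render time for a short clip, assuming 24 fps
# and the ~2.5 minutes per frame measured above.
FPS = 24
CLIP_SECONDS = 6
MINUTES_PER_FRAME = 2.5

frames = FPS * CLIP_SECONDS              # 144 frames
total_minutes = frames * MINUTES_PER_FRAME
print(f"{frames} frames -> {total_minutes / 60:.1f} hours")  # 144 frames -> 6.0 hours
```

The estimate comes in around 6 hours; the gap from the observed ~7 is plausibly per-frame variation and scene setup overhead.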

There’s definitely a lot of fun to be had with 3D modeling in Blender. Although the tool took a little while to get used to, once I understood its quirks I was able to make things happen very easily. I highly recommend that anyone interested in 3D modeling take the course I took (search for Ben Tristem and Blender on Udemy).


First steps in VR

Lately, I’ve been putting a lot of time and research into Virtual Reality (VR) technologies. My first experience with VR was at a showcase a few months back, where I was able to demo a few different experiences at various price ranges. Of the experiences I had, the Vive stood out to me as the most developed solution at this time.

I tried the Vive with an application called Realities.IO, an app that transports users to various real-world locations. After a short spiel by one of the developers, I donned the headset and was transported into space, with the Earth at the center. I was able to walk around the Earth and pan it using the Vive’s hand controllers; once I found a location I liked, I was instructed to zoom in.



The location was a Mayan temple in South America. The scene faded into view and I was instantly awed. I could walk short distances around the temple, and could also teleport to places beyond the reach of the room I was in. I couldn’t get over how real it all looked. Coming from software confined to the flat surface of a monitor, this experience encompassed my entire view, and my movements matched the movements of my ethereal character, standing on the steps of a Mayan temple I may never visit in my life. At one point, I knelt down to get a closer look at some rockwork and, without thinking, reached out to touch it; a brief moment of immersion, broken because the rock was never there in the first place, but it seemed real enough to at least try.

Where I was; at least, where I was transported to in VR.

The experience was over as quickly as it had begun; the headset and headphones came off, and I was back in the conference hall, amid the familiar murmurs and chatter of the space. But for that short moment in time, I could’ve convinced myself that I was somewhere else entirely.