A photographer and video camera enthusiast known as mr_ekan on Reddit has recently attracted significant attention by showcasing an impressive Python script for simulating realistic camera movement in Blender.
Originally created to solve a challenge in the creator's own ongoing project, the setup relies on the SensorLog app and, like other projects that drive 3D scenes with real-world hardware, uses a smartphone's gyroscope. To get the captured motion into Blender, the setup also includes an additional script that converts the raw IMU readings into keyframes. "The app I'm using has a recording functionality which then can be exported and converted through this script," the author noted.
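The author's converter hasn't been published yet, but the general idea of turning logged orientation data into camera keyframes is straightforward to sketch with Blender's Python API. The snippet below is a minimal illustration under assumptions of our own: the CSV path, column names, and frame rate are placeholders, and the axis mapping from the phone's frame to Blender's frame would likely need adjusting.

```python
import csv
import bpy

# A minimal sketch, not the author's unreleased script. The CSV path,
# column names, and frame rate are assumptions; adapt them to whatever
# the recording app actually exports.
CSV_PATH = "/path/to/imu_recording.csv"
FPS = 24  # frame rate of the target Blender scene

cam = bpy.context.scene.camera  # animate the active scene camera
cam.rotation_mode = 'XYZ'

with open(CSV_PATH, newline="") as f:
    rows = list(csv.DictReader(f))

t0 = float(rows[0]["time_s"])  # hypothetical timestamp column, in seconds

for row in rows:
    t = float(row["time_s"]) - t0
    frame = 1 + int(round(t * FPS))
    # Hypothetical attitude columns in radians; the mapping from the
    # phone's axes to Blender's axes will likely need adjustment.
    cam.rotation_euler = (
        float(row["pitch_rad"]),
        float(row["roll_rad"]),
        float(row["yaw_rad"]),
    )
    cam.keyframe_insert(data_path="rotation_euler", frame=frame)
```

Note that keying every IMU sample (often logged at 100 Hz or more) produces very dense F-curves, so downsampling to the scene frame rate, as above, or keying every Nth sample is a common compromise.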
The best part is that the author plans to share the script as a GitHub repo by the end of the week, so you can experiment with it or use it in your own Blender projects. The final version is also expected to use acceleration data from the iPhone's IMU to calculate translation movement when needed. "I'm thinking of building out a full plugin interface for this with a ton of features," commented the creator. "Right now it's very much hacky but will be packaged well soon." Until then, we highly encourage you to follow mr_ekan on Reddit and Instagram so you don't miss the upcoming release.
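For the planned translation feature, the usual approach is to double-integrate acceleration into a position track, which can then be keyed onto the camera the same way. The sketch below illustrates only that idea, not the author's implementation: the sample rate and data are placeholders, and it deliberately omits the gravity removal and drift correction that real accelerometer data requires.

```python
import bpy

# A rough sketch of the idea only: naively double-integrate acceleration
# samples into a position track and key it onto the camera. Real IMU data
# needs gravity removal and drift correction, which this omits.
FPS = 24
DT = 1.0 / 100.0  # assumed 100 Hz IMU sample rate

accel_samples = [(0.0, 0.0, 0.0)] * 200  # placeholder (ax, ay, az) in m/s^2

cam = bpy.context.scene.camera
velocity = [0.0, 0.0, 0.0]
position = [0.0, 0.0, 0.0]

for i, sample in enumerate(accel_samples):
    # Euler integration: acceleration -> velocity -> position
    for axis in range(3):
        velocity[axis] += sample[axis] * DT
        position[axis] += velocity[axis] * DT
    frame = 1 + int(round(i * DT * FPS))
    cam.location = position
    cam.keyframe_insert(data_path="location", frame=frame)
```

Because small acceleration errors accumulate quickly under double integration, accelerometer-only translation tends to drift, which is presumably why the creator frames it as an optional addition rather than the core of the tool.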