Ukraine Photogrammetry

Fuck Putin, Слава Україні! (Glory to Ukraine!) 🇺🇦 ✊ 🌻

Those who don't study history are doomed to repeat it, and we can only study what we record. I've been experimenting with creating 3D reconstructions of scenes from Russia's unprovoked invasion of Ukraine using videos from Twitter, Telegram, TikTok, and YouTube. I'm making these available here under CC BY-SA. Please DM me on Twitter or email me at thot@thiic.cc if you have any questions or have footage to use for more reconstructions. Source code is available on GitHub.

Reconstructed Scenes

These may take a moment to load; they are 10-20 MB each. VR viewing should be available as well, with everything scaled true to life. If anyone has information about the original sources of these videos, or higher-quality versions of them, please contact me.

BMP-2 Wreckage

3D viewer
Recreated from this video

Hind crashed in a field

3D viewer
Recreated from this video

Hind crashed on a hillside

3D viewer
Recreated from this video

Ka-52 crashed in a field

3D viewer
Recreated from this video

Procedure

The individual steps here have been covered pretty well elsewhere on the internet, but here are some notes to make it easier if you want to attempt something similar. I'll try to keep this updated with anything else I discover.

1. Split video to images

ffmpeg makes this easy:

ffmpeg -i crashed_hind.mp4 -ss 7.5 -to 30 -r 5/1 h%03d.png

-ss 7.5 -to 30 means process from 7.5 sec to 30 sec
-r 5/1 sets the output framerate to 5 fps
h%03d.png writes to a *.png sequence like h001.png, h002.png, ...

For some videos it's necessary to remove certain sections, or to process a section with faster motion at a higher temporal resolution; it's good to be aware of that as a potential remedy when tracking is lost.
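As a rough illustration of that segment-by-segment idea, here's a hedged sketch in TypeScript (Node, calling ffmpeg through child_process). The input file name, segment boundaries, and framerates are made-up examples, not values from any of the scenes above:

// Sketch: extract frames segment by segment, sampling fast-motion sections
// at a higher framerate so the reconstruction is less likely to lose tracking.
// Assumes ffmpeg is on the PATH; all times, rates, and names are illustrative.
import { execFileSync } from "node:child_process";

interface Segment {
  start: number;  // seconds
  end: number;    // seconds
  fps: number;    // extraction framerate
  prefix: string; // output file prefix
}

const segments: Segment[] = [
  { start: 7.5, end: 20, fps: 5, prefix: "a" },  // slow, steady pan
  { start: 20, end: 30, fps: 15, prefix: "b" },  // fast motion, denser sampling
];

for (const s of segments) {
  execFileSync("ffmpeg", [
    "-i", "crashed_hind.mp4",
    "-ss", String(s.start),
    "-to", String(s.end),
    "-r", `${s.fps}/1`,
    `${s.prefix}%03d.png`,
  ], { stdio: "inherit" });
}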

2. Reconstruct scene

I use Meshroom for this (good overview of Meshroom usage). In general I stick with the default processing graph, with a couple of exceptions. As a second-to-last step I'll usually add a MeshDecimate node to tame the mesh density a bit, since the noisy data from compressed cell-phone videos can't really hope to represent that much detail anyway. On the final step I use LSCM for UV generation in order to make sure I have just one texture atlas for the final mesh. Finally, as a last resort when things aren't working, I'll turn on generating and matching the AKAZE feature descriptors in addition to SIFT; in my experience this is mostly a shot in the dark and rarely helps tangibly.

I'm curious if commercial offerings would do a better job of this so if anyone has a spare license/credits for any of those please get in touch.

3. Clean and prepare mesh

Once I've got a mesh, I use Blender to try and scale the scene as best I can to real-world units, using a combination of publicly available vehicle blueprints and guesswork. If the mesh needs cleaning up I'll do that as well. Once things look good I export everything as a *.glb, which can be imported directly into three.js and generally seems like a decently well-supported interchange format. The viewer is just a few lines of JS, and pretty much all of the heavy lifting is done by three.
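For a sense of what that looks like, here's a minimal viewer sketch in TypeScript using three.js. This is not the actual viewer source; the file name scene.glb, the camera placement, and the lighting are assumptions for the example:

// Minimal *.glb viewer sketch built on three.js (illustrative, not the real viewer).
import * as THREE from "three";
import { GLTFLoader } from "three/examples/jsm/loaders/GLTFLoader.js";
import { OrbitControls } from "three/examples/jsm/controls/OrbitControls.js";

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const scene = new THREE.Scene();
scene.add(new THREE.AmbientLight(0xffffff, 1.0)); // flat lighting for a baked texture

const camera = new THREE.PerspectiveCamera(
  60, window.innerWidth / window.innerHeight, 0.1, 1000);
camera.position.set(3, 2, 3); // meters, since the mesh is scaled to real-world units

const controls = new OrbitControls(camera, renderer.domElement);

// Load the *.glb exported from Blender (file name is hypothetical).
new GLTFLoader().load("scene.glb", (gltf) => {
  scene.add(gltf.scene);
});

renderer.setAnimationLoop(() => {
  controls.update();
  renderer.render(scene, camera);
});

three.js also ships a WebXR helper (VRButton), so the same scene can be opened in VR with only a few extra lines.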

Does Hubble's Law mean the James Webb can see the past in color?


Short answer? Yes. Even though the James Webb Space Telescope (JWST) is an infrared telescope, its main camera, the NIRCam, will effectively be able to take visible-light images of things between 7 and 13 billion years old.

For those unfamiliar, here is a diagram of the spectrum visible to us humans compared with what the NIRCam on the Webb can see. As you can see, there's barely any overlap.

visible light

However, because of Hubble's Law and the effects of redshift, visible light gets redder the longer it has been traveling. Light from stars and galaxies that has been traveling for at least 7 billion years gets stretched out enough that it starts to fall entirely within the Webb's range.

7bya light

That shift continues as things get older and older, and we run out of sensitivity once whatever we're looking at is about 13 billion years old.

13bya light

Curious as to how this all works? Luckily we just need a few points of cosmological groundwork to tie it all together.

First, if you look at a star 5 light years away from earth, you're actually seeing that star as it was 5 years ago because the light took 5 years to get to you.

Second, the universe is expanding: everything is always getting farther away from everything else. This means that on very large scales, the farther away you are from something, the faster it is moving away from you. Roughly speaking, this is Hubble's Law. Together with the previous point, this means that the age of the light you see from a star, the star's speed away from you, and its distance are all tied together.
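To put rough numbers on that relation, here's a tiny sketch in TypeScript. The Hubble constant value of about 70 km/s per megaparsec is the commonly quoted approximation, and the distances are just examples:

// Hubble's Law: recession velocity grows linearly with distance, v = H0 * d.
const H0 = 70; // km/s per megaparsec (approximate, commonly quoted value)

function recessionVelocity(distanceMpc: number): number {
  return H0 * distanceMpc; // km/s
}

console.log(recessionVelocity(100));  // ~7,000 km/s
console.log(recessionVelocity(1000)); // ~70,000 km/s, a sizeable fraction of light speed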

Finally, we need to understand redshift, which is probably best done by analogy.

So you know how when a train is rushing past you and it honks, it sounds like "nnneeeyyaouwwwwwww" instead of the simple "nnaaaaaaaaaaaaa" you'd hear when it's standing still? That's because the sound waves get squished when it's coming at you and stretched out when it's rolling away. We hear the pitch get higher when the waves are squished, and lower when they're stretched. The same thing happens with light, but we perceive that change in "pitch" as a shift in color. You can think of blue light as being high pitched, and red as low pitched. As something moves faster and faster away from you, the "pitch" of its light gets lower and lower and so the color gets redder and redder. This is called redshift. The opposite happens if something is moving toward you, resulting in a blueshift.

So to bring this back around, what Hubble's Law is actually saying is that the older the light from something is, the more redshifted it will appear. With a little math we can see that visible light from beyond a certain age is redshifted enough to be visible to the JWST's infrared-seeing eyes.
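That little bit of math is just wavelength stretching: the observed wavelength is the emitted wavelength times (1 + z), where z is the redshift. Here's a hedged sketch in TypeScript; the redshift values (z of roughly 1 for ~7-billion-year-old light, roughly 9 for ~13-billion-year-old light) are rough round numbers, and NIRCam's range of roughly 0.6-5 micrometers is approximate:

// Redshift stretches wavelengths: lambda_observed = lambda_emitted * (1 + z).
// NIRCam covers roughly 600 to 5000 nm (0.6 to 5 micrometers).
const NIRCAM_MIN_NM = 600;
const NIRCAM_MAX_NM = 5000;

const observedNm = (emittedNm: number, z: number) => emittedNm * (1 + z);
const inNircamRange = (nm: number) => nm >= NIRCAM_MIN_NM && nm <= NIRCAM_MAX_NM;

// Rough, assumed redshifts: z ~ 1 for light about 7 billion years old,
// z ~ 9 for light about 13 billion years old.
const epochs = [
  { label: "~7 billion years old (z ~ 1)", z: 1 },
  { label: "~13 billion years old (z ~ 9)", z: 9 },
];

for (const { label, z } of epochs) {
  for (const emitted of [450, 550, 650]) { // blue, green, red (nm)
    const nm = observedNm(emitted, z);
    console.log(`${label}: ${emitted} nm -> ${nm} nm (in NIRCam range: ${inNircamRange(nm)})`);
  }
}
// z ~ 1: 450 -> 900, 550 -> 1100, 650 -> 1300 nm -- all inside NIRCam's band
// z ~ 9: 450 -> 4500, 550 -> 5500, 650 -> 6500 nm -- only the blue end still fits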
