DerStig and Tuna, at the end of the day it's all moot anyway. Any video you shoot is meant to be communicated to others, and almost no one has enough bandwidth to stream 4K. Netflix, for example, requires a steady internet connection speed of 25 Mbps to do it. The reality is that true 4K streaming can't happen at even 12-15 Mbps unless the encoder achieves roughly a 40% efficiency gain going from H.264 to HEVC, and the content is 24/30 fps, not 60 fps. Once a true 60 Mbps 4K video from a high-quality camera is downscaled and re-encoded so it can be streamed to the masses, who cares what it came from? Most of that great quality is lost unless all you plan on doing is viewing your own MP4s on your own PC.
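Just to sanity-check the numbers above with a back-of-the-envelope calculation (this is my own sketch; the 25 Mbps and 40% figures are the ones from the post, not measurements):

```python
# Rough bitrate arithmetic for the 4K streaming claim above.
# Assumption: Netflix's 25 Mbps recommendation as the H.264-era baseline,
# and a ~40% HEVC efficiency gain over H.264 for 24/30 fps content.

H264_4K_MBPS = 25.0   # steady connection speed Netflix asks for 4K
HEVC_SAVINGS = 0.40   # assumed bitrate savings of HEVC vs. H.264

hevc_mbps = H264_4K_MBPS * (1 - HEVC_SAVINGS)
print(f"4K over HEVC at 40% savings: ~{hevc_mbps:.0f} Mbps")  # ~15 Mbps

# A 60 fps stream carries twice the frames of a 30 fps one, so the
# budget blows past 15 Mbps again (real encoders don't scale exactly
# linearly with frame rate, but the direction is the same).
```

That's how you land in the 12-15 Mbps range, and why it only works for 24/30 fps content.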
To quote a portion of a Wired article:
"Even after the 4K streams have been optimized at the source, they’ll still require at least two to three times the bandwidth you’d need today to watch a 1080p HD feed. This is a problem that the industry can only solve by reorganizing its infrastructure, something that requires not only a significant capital investment, but also a lot of time. It will likely take years for cable providers to provision enough bandwidth on existing systems to deliver 4K video from the networks."
"Pulling in a 4K signal over the air should also be possible, but it will take years if it happens at all. First, major networks will need to decide to broadcast content in 4K and upgrade their equipment. Then they’ll need to get on the same page regarding next-generation broadcasting technologies."