For some time now I have been contemplating how to implement better,
i.e. more instructive, images of our star, the Sun.
There are these great images from SOHO, taken at 6 different wavelengths at a given time of observation, altogether a
wealth of amazing information...
Have a look how great they are (just 3 wavelengths shown):
Recently, I had the idea of exploiting mipmaps for this
purpose! Mipmaps are always generated and stored along with the main
texture in DXT (.dds) formatted textures. What does that imply?
Take a 4k solar texture, for example. Along with the main 4k texture,
the .dds file would then store a sequence of 12 mipmaps, each a
factor of 2 smaller in (linear) size than the previous one. The point is
that, depending on the viewing distance of that texture, lower- and
lower-resolution mipmaps are smoothly blended in during
display. After all, at a large distance from the Sun we cannot
distinguish all those fine details anymore!! So we save resources this
way.
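The mipmap arithmetic above, and the way the renderer picks a level from the viewing distance, can be sketched like this (the LOD formula is the standard log2 texel-to-pixel ratio used in trilinear filtering; the 4096-pixel size is just the 4k example from above):

```python
import math

def mip_chain(base_size):
    """Sizes of the mip levels below a square base texture:
    each level is half the linear size of the previous one."""
    sizes = []
    s = base_size
    while s > 1:
        s //= 2
        sizes.append(s)
    return sizes

def mip_level(texels_per_pixel):
    """Approximate LOD the renderer selects: log2 of the ratio of
    texels to screen pixels, clamped at the base level (0)."""
    return max(0.0, math.log2(texels_per_pixel))

print(mip_chain(4096))  # 12 levels: 2048, 1024, ..., 1
print(mip_level(8.0))   # 3.0 -> blending happens around mip level 3
```

This is why a 4k (4096-pixel) base texture carries exactly 12 mipmaps: 4096 = 2^12, so halving down to 1 pixel takes 12 steps.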
My idea is based on the fact that, e.g. with the NVIDIA command-line tools,
one may assemble custom-made mipmap images into one .dds file.
This could be exploited as follows:
Take some usual low-res standard solar texture, just with a few sunspots, for
the large-distance mipmaps, but incorporate mipmaps from SOHO for
short-distance hi-res display! The blending from the low-res texture to the
hi-res SOHO imagery would then happen automatically.
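A minimal sketch of the assembly plan, assuming a hypothetical helper that only decides which source image feeds each mip level (the actual packing into one .dds file would still be done with the NVIDIA tools; the 4096/1024 sizes are illustrative assumptions):

```python
def plan_mip_sources(base_size, standard_size):
    """For each mip level of a chain starting at base_size, decide
    whether it should be derived from the hi-res SOHO image (the fine,
    short-distance levels) or from the standard low-res solar texture
    (the coarse, large-distance levels). Purely illustrative."""
    plan = []
    size = base_size
    level = 0
    while True:
        source = "SOHO" if size > standard_size else "standard"
        plan.append((level, size, source))
        if size == 1:
            break
        size //= 2
        level += 1
    return plan

# Hypothetical example: 4k SOHO base, 1k standard texture.
for level, size, source in plan_mip_sources(4096, 1024):
    print(f"mip {level:2d}: {size:4d} px  from {source}")
```

With these example sizes, mips 0 and 1 would come from SOHO and everything from 1024 pixels downward from the plain solar texture, so the hand-off happens automatically at the matching distance.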
Of course, this always brings me back, conceptually, to my other proposal
of incorporating virtual Celestia "instruments" through a set of filters for
different light wavelengths. The SOHO data would be great for this,
since with a virtual "filter wheel" one might eventually tune through
the various data at different wavelengths (all taken at the same time!)...
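The filter-wheel idea could be as simple as cycling through a set of co-temporal textures, one per wavelength. A minimal sketch (the class and its interface are my own invention; the wavelengths are the four SOHO/EIT channels, 171, 195, 284 and 304 Angstrom, given here in nm):

```python
class FilterWheel:
    """Toy model of a virtual filter wheel: steps through textures
    taken at different wavelengths at the same observation time."""

    def __init__(self, wavelengths_nm):
        self.wavelengths = wavelengths_nm
        self.index = 0

    @property
    def current(self):
        return self.wavelengths[self.index]

    def next(self):
        """Rotate the wheel one position, wrapping around."""
        self.index = (self.index + 1) % len(self.wavelengths)
        return self.current

wheel = FilterWheel([17.1, 19.5, 28.4, 30.4])  # EIT channels in nm
print(wheel.current)  # 17.1
print(wheel.next())   # 19.5
```

In Celestia this would amount to swapping the active surface texture (or .dds mip chain) for the Sun each time the wheel advances.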