what is nostalgia?
nos·tal·gia/nɒˈstældʒə, -dʒiə, nə-/
1.a wistful desire to return in thought or in fact to a former time in one's life, to one's home or homeland, or to one's family and friends;
a sentimental yearning for the happiness of a former place or time: a nostalgia for his college days.
2.something that elicits or displays nostalgia.
1770–80; Neo-Latin, from Greek nóst(os) a return home + -algia
Dictionary.com, "nostalgia," in Dictionary.com Unabridged. Random House, Inc. http://dictionary.reference.com/browse/nostalgia (accessed July 21, 2011).
forced random sampling
Daniel Cornel, Robert F. Tobler, Hiroyuki Sakai, Christian Luksch, Michael Wimmer
Forced Random Sampling: fast generation of importance-guided blue-noise samples,
In The Visual Computer, 33(6), pages 833-843, 2017.
Abstract: In computer graphics, stochastic sampling is frequently used to efficiently approximate complex functions and integrals. The error of approximation can be reduced by distributing samples according to an importance function, but cannot be eliminated completely. To avoid visible artifacts, sample distributions are sought to be random, but spatially uniform, which is called blue-noise sampling. The generation of unbiased, importance-guided blue-noise samples is expensive and not feasible for real-time applications. Sampling algorithms for these applications focus on runtime performance at the cost of having weak blue-noise properties. Blue-noise distributions have also been proposed for digital halftoning in the form of precomputed dither matrices. Ordered dithering with such matrices makes it possible to distribute dots with blue-noise properties according to a grayscale image. By the nature of ordered dithering, this process can be parallelized easily. We introduce a novel sampling method called forced random sampling that is based on forced random dithering, a variant of ordered dithering with blue noise. By shifting the main computational effort into the generation of a precomputed dither matrix, our sampling method runs efficiently on GPUs and allows real-time importance sampling with blue noise for a finite number of samples. We demonstrate the quality of our method in two different rendering applications.
Link: Publisher's version
Download: BibTeX citation
This is the author's version of the work. The final publication is available at Springer via http://dx.doi.org/10.1007/s00371-017-1392-7.
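The core idea of the paper, selecting sample points by thresholding a precomputed dither matrix with an importance function, can be illustrated with a small sketch. This is not the paper's implementation: a plain random matrix stands in for the precomputed forced-random dither matrix (so the blue-noise quality of the real method is lost), and the function name and the gradient importance function are made up for illustration.

```python
import random

def sample_by_thresholding(dither, importance, n_samples):
    """Pick the n_samples cells whose dither value is smallest relative
    to their local importance (an ordered-dithering-style selection)."""
    h, w = len(dither), len(dither[0])
    ranked = sorted(
        (dither[y][x] / max(importance(x / w, y / h), 1e-9), x, y)
        for y in range(h) for x in range(w))
    return [(x, y) for _, x, y in ranked[:n_samples]]

random.seed(1)
# Stand-in for the precomputed blue-noise dither matrix (purely random here).
M = [[random.random() for _ in range(64)] for _ in range(64)]
# Toy importance function: sample density increases from left to right.
samples = sample_by_thresholding(M, lambda u, v: u, 256)
```

As expected for importance sampling, far more of the 256 samples land in the right (high-importance) half of the domain than in the left.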
analysis of forced random sampling
Daniel Cornel
Analysis of Forced Random Sampling,
Master's thesis, TU Wien, Institute of Computer Graphics and Algorithms, Vienna, Austria, pages 1-85, 2014.
Abstract: Stochastic sampling is an indispensable tool in computer graphics which allows approximating complex functions and integrals in finite time. Applications which rely on stochastic sampling include ray tracing, remeshing, stippling and texture synthesis. In order to cover the sample domain evenly and without regular patterns, the sample distribution has to guarantee spatial uniformity without regularity and is said to have blue-noise properties. Additionally, the samples need to be distributed according to an importance function such that the sample distribution satisfies a given sampling probability density function globally while being well distributed locally. The generation of optimal blue-noise sample distributions is expensive, which is why a lot of effort has been devoted to finding fast approximate blue-noise sampling algorithms. Most of these algorithms, however, are either not applicable in real time or have weak blue-noise properties.
Forced Random Sampling is a novel algorithm for real-time importance sampling. Samples are generated by thresholding a precomputed dither matrix with the importance function. By the design of the matrix, the sample points show desirable local distribution properties and are adapted to the given importance. In this thesis, an efficient and parallelizable implementation of this algorithm is proposed and analyzed regarding its sample distribution quality and runtime performance. The results are compared to both the qualitative optimum of blue-noise sampling and the state of the art of real-time importance sampling, which is Hierarchical Sample Warping. With this comparison, it is investigated whether Forced Random Sampling is competitive with current sampling algorithms.
The analysis of sample distributions includes several discrepancy measures and the sample density to evaluate their spatial properties as well as Fourier and differential domain analyses to evaluate their spectral properties. With these established methods, it is shown that Forced Random Sampling generates samples with approximate blue-noise properties in real time. Compared to the state of the art, the proposed algorithm is able to generate samples of higher quality with less computational effort and is therefore a valid alternative to current importance sampling algorithms.
Download: PDF file (8.4 MB)
composite flow maps
Daniel Cornel, Artem Konev, Bernhard Sadransky, Zsolt Horváth, Andrea Brambilla, Ivan Viola, Jürgen Waser
Composite Flow Maps,
In Computer Graphics Forum (Proceedings EuroVis 2016), 35(3), pages 461-470, 2016.
Abstract: Flow maps are widely used to provide an overview of geospatial transportation data. Existing solutions lack the support for the interactive exploration of multiple flow components at once. Flow components are given by different materials being transported, different flow directions, or by the need for comparing alternative scenarios. In this paper, we combine flows as individual ribbons in one composite flow map. The presented approach can handle an arbitrary number of sources and sinks. To avoid visual clutter, we simplify our flow maps based on a force-driven algorithm, accounting for restrictions with respect to application semantics. The goal is to preserve important characteristics of the geospatial context. This feature also enables us to highlight relevant spatial information on top of the flow map such as traffic conditions or accessibility. The flow map is computed on the basis of flows between zones. We describe a method for auto-deriving zones from geospatial data according to application requirements. We demonstrate the method in real-world applications, including transportation logistics, evacuation procedures, and water simulation. Our results are evaluated with experts from corresponding fields.
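The paper's force-driven simplification algorithm is not spelled out here, so the following is only a generic sketch of the underlying idea: layout nodes repel each other to reduce visual clutter, while an anchor force keeps each node near its original geospatial position. All names and constants are made up for illustration.

```python
def relax(points, repulsion=0.5, anchor=0.1, iterations=50):
    """Generic force-driven layout relaxation: pairwise repulsion pushes
    nodes apart, an anchor force preserves the original positions."""
    pos = [list(p) for p in points]
    for _ in range(iterations):
        for i, p in enumerate(pos):
            fx = fy = 0.0
            for j, q in enumerate(pos):
                if i == j:
                    continue
                dx, dy = p[0] - q[0], p[1] - q[1]
                d2 = dx * dx + dy * dy + 1e-6
                fx += repulsion * dx / d2
                fy += repulsion * dy / d2
            # Anchor force back toward the original layout position.
            fx += anchor * (points[i][0] - p[0])
            fy += anchor * (points[i][1] - p[1])
            p[0] += 0.1 * fx
            p[1] += 0.1 * fy
    return [tuple(p) for p in pos]

# Two nearly overlapping nodes get pushed apart; a distant node barely moves.
layout = relax([(0.0, 0.0), (0.1, 0.0), (5.0, 5.0)])
```

In the actual method, such forces additionally respect application semantics, e.g. keeping flows consistent with the geospatial context.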
Download: PDF file (15.0 MB)
Download: BibTeX citation
Download: EuroVis 2016 PowerPoint slides (203 MB)
visualization of object-centered vulnerability to possible flood hazards
Daniel Cornel, Artem Konev, Bernhard Sadransky, Zsolt Horváth, Eduard Gröller, Jürgen Waser
Visualization of Object-Centered Vulnerability to Possible Flood Hazards,
In Computer Graphics Forum (Proceedings EuroVis 2015), 34(3), pages 331-341, 2015. Best Paper Award, 3rd place
Abstract: As flood events tend to happen more frequently, there is a growing demand for understanding the vulnerability of infrastructure to flood-related hazards. Such demand exists both for flood management personnel and the general public. Modern software tools are capable of generating uncertainty-aware flood predictions. However, the information addressing individual objects is incomplete, scattered, and hard to extract. In this paper, we address vulnerability to flood-related hazards focusing on a specific building. Our approach is based on the automatic extraction of relevant information from a large collection of pre-simulated flooding events, called a scenario pool. From this pool, we generate uncertainty-aware visualizations conveying the vulnerability of the building of interest to different kinds of flooding events. On the one hand, we display the adverse effects of the disaster on a detailed level, ranging from damage inflicted on the building facades or cellars to the accessibility of the important infrastructure in the vicinity. On the other hand, we provide visual indications of the events to which the building of interest is vulnerable in particular. Our visual encodings are displayed in the context of urban 3D renderings to establish an intuitive relation between geospatial and abstract information. We combine all the visualizations in a lightweight interface that enables the user to study the impacts and vulnerabilities of interest and explore the scenarios of choice. We evaluate our solution with experts involved in flood management and public communication.
Download: PDF file (14.3 MB)
Download: BibTeX citation
Download: EuroVis 2015 PowerPoint slides (55.7 MB)
The lighthouse real-time demo was made during a practical course on real-time rendering at the Vienna University of Technology in 2011. The task was to create an OpenGL 3.x demo, harmonious in content and visual appearance, that uses a few common real-time effects.
It is written in C# and uses OpenGL 4.0 to support displacement mapping via the then-new tessellation shaders. Other effects include bloom, depth peeling, irradiance and environment cube mapping, parallax normal mapping, post-process volumetric light scattering, projected grid water with Perlin noise heightfield waves, and shadow mapping with percentage-closer filtering.
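Percentage-closer filtering, one of the effects listed above, averages the binary shadow-map depth comparison over a small neighborhood instead of filtering depths directly, which softens shadow edges. A minimal CPU-side sketch of the idea (illustrative only; the demo itself does this per fragment in GLSL, and all names here are made up):

```python
def pcf_shadow(shadow_map, x, y, fragment_depth, radius=1, bias=0.005):
    """Fraction of neighboring shadow-map texels the fragment lies in front
    of (0.0 = fully shadowed, 1.0 = fully lit), giving soft shadow edges."""
    h, w = len(shadow_map), len(shadow_map[0])
    lit, total = 0, 0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            # Clamp lookups to the shadow-map borders.
            sx = min(max(x + dx, 0), w - 1)
            sy = min(max(y + dy, 0), h - 1)
            lit += fragment_depth - bias <= shadow_map[sy][sx]
            total += 1
    return lit / total

# Tiny depth map with an occluder (depth 0.3) covering the left half.
depth_map = [[0.3] * 4 + [1.0] * 4 for _ in range(8)]
light = pcf_shadow(depth_map, 6, 4, 0.8)  # away from the occluder: lit
shade = pcf_shadow(depth_map, 1, 4, 0.8)  # behind the occluder: shadowed
```

At the occluder boundary the function returns a fractional value, which is exactly the soft edge PCF is used for.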
The original music was taken from the Black Mesa Source theme.
Download: Binaries for Windows (23.9 MB)
Note: This demo requires an OpenGL 4.0 compatible video card (NVIDIA GeForce 400 / ATI Radeon HD 5000 series or better) for the tessellation shaders and will not run otherwise. Also, the demo was written for NVIDIA systems and might not work properly with ATI/AMD video cards.
sop - setoptimusprofile
NVIDIA Optimus is an energy-saving technology that allows automatic, seamless switching between an integrated and a high-performance discrete GPU in mobile devices. Unfortunately, the switching decision is currently based* on a profile system in which application profiles have to be created either by NVIDIA (pre-defined profiles for popular applications such as video games) or by the user.
* Information regarding the switching system is sparse. According to the NVIDIA Optimus whitepaper, the discrete NVIDIA GPU is triggered by DX, DXVA, and CUDA calls, but not by OpenGL calls. In my experience, however, CUDA calls do not trigger the switching mechanism.
For a developer on an Optimus system, this is rather annoying, since a separate application profile has to be created for each GPU-intensive application. Even more annoying, every user of the application who is on an Optimus system has to create a profile for it, too, or else the application will run on the integrated GPU, which might perform poorly or might not work at all because of missing hardware features.
SOP is a simple workaround for developers based on NVIDIA's NvAPI. It creates an application profile for the current application such that it is always started on the discrete GPU. If the profile already exists, the application is bound to the existing profile. Multiple applications can be gathered in one generic profile, which keeps the profile system neat and tidy. The profile itself can also be edited in the NVIDIA Control Panel.
staircase-aware smoothing of medical surface meshes
This is an implementation of staircase-aware mesh smoothing as proposed in the paper "Staircase-Aware Smoothing of Medical Surface Meshes" by Tobias Mönch, Simon Adler, and Bernhard Preim.
For further information, please visit this site. There you'll find a detailed description of the application, implementation and parameter details, as well as download links for Windows binaries, sources, and documentation.
texture virtualization for terrain rendering
Antelope Island Utah - Visualising Gigapixel Texture Datasets Using Virtual Texturing.
Developed by the Multimedia Lab of Ghent University. Image retrieved from here.
This is a state-of-the-art report written in April 2012 on the concept of texture virtualization, including both Clipmaps and Virtual Texturing. A modern, comprehensive Virtual Texturing system for real-time applications is presented, along with popular and important acceleration techniques, recent developments, and promising fields of application that have not been covered in the main contributions on the matter.
Abstract: Virtual texturing is a technique that allows the use of arbitrarily large textures within the limited physical video memory. Through a paging and streaming system, only the currently visible parts of a mipmap chain are stored in the video memory while the rest of the data may reside in any other memory or storage device. Not only does this enable the use of unique and very detailed textures, but it also makes high-resolution images such as satellite or aerial photography data usable in real-time applications without further modifications or downsampling.
This work sketches the virtual texturing pipeline and discusses its benefits and limitations. Due to the nature of terrains in real-time applications, the discussed methods are of particular importance for performant and photorealistic terrain rendering and are thus viewed with regard to these properties and needs. Special emphasis is devoted to recent developments in virtual texturing and possible future fields of application, as well as acceleration techniques.
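The central mechanism of virtual texturing, translating a virtual texture coordinate into a physical cache location through an indirection (page) table, can be sketched as follows. The names, the tiny page size, and the dictionary-based page table are illustrative assumptions, not taken from any particular system; a GPU implementation would use an indirection texture instead.

```python
PAGE_SIZE = 128  # texels per page edge (illustrative)

def virtual_to_physical(u, v, virt_size, page_table, cache_pages):
    """Resolve a virtual UV to (physical page, in-page texel offset) via
    the indirection table; a miss would trigger a streaming request."""
    tx = min(int(u * virt_size), virt_size - 1)
    ty = min(int(v * virt_size), virt_size - 1)
    page = (tx // PAGE_SIZE, ty // PAGE_SIZE)
    if page not in page_table:
        return None  # page fault: request the page, fall back to a coarser mip
    slot = page_table[page]
    return cache_pages[slot], (tx % PAGE_SIZE, ty % PAGE_SIZE)

# A 1024^2 virtual texture with only two pages resident in the physical cache.
page_table = {(0, 0): 0, (7, 7): 1}
cache = ["page_A", "page_B"]
hit = virtual_to_physical(0.99, 0.99, 1024, page_table, cache)
miss = virtual_to_physical(0.5, 0.5, 1024, page_table, cache)
```

Only the resident pages consume video memory; the lookup for a non-resident page fails and would be fed back to the streaming system.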
Download: PDF file (2.3 MB)
assimp 3.0 opengl demo with bone animation
"Open Asset Import Library (short name: Assimp) is a portable Open Source library to import various well-known 3D model formats in a uniform manner. The most recent version also knows how to export 3d files and is therefore suitable as general-purpose 3D model converter. [...] Assimp aims at providing a full asset conversion pipeline for use in game engines / realtime rendering systems of any kind - but is not limited to this audience. In the past, it has been used in a wide and diverse range of applications."
As the accompanying tool AssimpView uses the Direct3D API, there was no sample code available for using Assimp with OpenGL for quite some time. Bone animation in particular is a cumbersome task, which is why I decided to create and provide a basic bone animation demo for OpenGL. The demo is written in C++ and uses SDL for context handling and GLEW for extension handling. Both libraries are included in the download link below.
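The core of bone animation is linear blend skinning: each vertex is transformed by a weighted sum of its bones' matrices, with the weights coming from Assimp's aiBone data. A minimal sketch of just the math (pure Python with 2D homogeneous coordinates for brevity; the demo does the equivalent per vertex in C++/GLSL, and real bone matrices also include the inverse bind pose):

```python
def mat_vec(m, v):
    """Multiply a 3x3 homogeneous matrix with a 2D point (w = 1)."""
    x, y = v
    return (m[0][0] * x + m[0][1] * y + m[0][2],
            m[1][0] * x + m[1][1] * y + m[1][2])

def skin_vertex(vertex, bones, weights):
    """Linear blend skinning: weighted sum of bone-transformed positions."""
    sx = sy = 0.0
    for m, w in zip(bones, weights):
        px, py = mat_vec(m, vertex)
        sx += w * px
        sy += w * py
    return (sx, sy)

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
shift_x  = [[1, 0, 2], [0, 1, 0], [0, 0, 1]]  # translate +2 in x

# A vertex bound half to a static bone and half to a moving one
# ends up halfway between the two transforms.
skinned = skin_vertex((1.0, 1.0), [identity, shift_x], [0.5, 0.5])
```

The weighted blend is what makes skin deform smoothly where several bones influence the same vertex.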
Note: This demo was written for NVIDIA systems and might not work properly with ATI/AMD video cards.
integration of 3d characters into a 2d adventure engine
Since winter 2010, I have been developing and maintaining a library for the adventure game engine Visionaire that integrates 3D characters into the 2D game scenes. The main objective is to integrate the 3D models in such a way that they behave like the existing 2D characters in the game logic and are interchangeable with them. Backward compatibility, efficient rendering, and simple interfaces for the user are also crucial design criteria. Besides the basic rendering, the library offers shadow rendering, toon shading, and ambient occlusion, which can be adjusted as needed to seamlessly integrate the 3D character into the scene, whether hand-drawn or pre-rendered.
The theoretical background of the implementation, as well as a general survey of 2.5D games, is discussed in my Bachelor's thesis at the Vienna University of Technology, which is available here (28.0 MB).
animated transitions in statistical data graphics
This program was made during a practical course about information visualization at the Vienna University of Technology in 2011. It implements the basic idea proposed in the paper Animated Transitions in Statistical Data Graphics (Heer and Robertson, 2007).
For further information, please visit this site. There you'll find a short summary of the paper, implementation and dataset details, as well as download links for Windows binaries, sources, and documentation.
Apefruit is an OpenGL 3.2 game that I developed together with a colleague during a practical course at the Vienna University of Technology in 2010. The gameplay is a crossover between football and the "bombing run" game mode of the Unreal Tournament series, except that the players are monkeys and the football is a giant banana.
The game was written in C++ and the shader language GLSL using GLFW for context handling, GLEW for extension handling, GLM as mathematics library, ODE for collision detection, and FMOD Ex for audio playback. All libraries are included in the download link below.
Download: Binaries for Windows (22.2 MB)
Note: This game was written for NVIDIA systems and might not work properly with ATI/AMD video cards.
contact and imprint
This website is maintained by
Have a question or trouble with some code? Have you already asked the oracle about it? If not, please do so now. Still have questions?
Carrier pigeon: No longer accepted due to repeated abuse.
drivenbynostalgia.com is a private, non-commercial website. All content on this website is published under the right to freedom of speech as declared in the Universal Declaration of Human Rights and is to be interpreted as my opinion at a certain point in time, or the opinion of a third party where denoted. Hence, I do not claim correctness or harmlessness of any of the content provided on this website.
This website contains links to third-party websites whose content I can neither control nor be held liable for. I added these links at a time when the linked websites did not seem to contain any offensive or illegal content. I do not permanently check all linked websites for availability and content changes, so please inform me if a link has to be changed or removed.
All compiled and source code files provided on this website are provided as-is and without any guarantee of executability, safety, or security. They are published in a publicly available private archive and are to be interpreted as coding suggestions that should not be used without an examination of the code, as should nothing you download from the internet. Several applications depend on third-party libraries, which might be included in the downloads. Please consider the respective licenses and copyright regulations.
If not denoted otherwise in the code files themselves, all of my code published on this website is published under the BSD 3-clause license!