CS552 lecture notes (cs552-tf-vector.pdf): slicing, transfer functions, and vector visualization

Slicing: 3D texture mapping

Store the volume in solid (3D) texture memory.

For each of the k screen-parallel image planes at distance l_k:

– intersect the slicing plane l_k with the volume and trilinearly interpolate f using 3D texture mapping (this requires computing 3D texture coordinates for the vertices of the slice polygon);

– blend the texture-mapped slice into the frame buffer (using back-to-front alpha blending).

3D texture mapping is implemented in hardware.
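As a rough illustration of the back-to-front blending step, here is a minimal Python/NumPy sketch (not part of the original slides) that composites a stack of screen-parallel RGBA slices in software; the slice resampling itself is assumed to have been done already (in practice by hardware 3D texture mapping), and all names are illustrative:

    import numpy as np

    def composite_back_to_front(slices_rgba):
        # slices_rgba: iterable of (H, W, 4) float arrays ordered farthest-first,
        # with straight (non-premultiplied) alpha in [0, 1].
        frame = None
        for s in slices_rgba:
            rgb, a = s[..., :3], s[..., 3:4]
            if frame is None:
                frame = rgb * a
            else:
                # back-to-front "over" blending: C = C_slice*a + C_frame*(1 - a)
                frame = rgb * a + frame * (1.0 - a)
        return frame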

Slice-based Rendering

– Each slice samples the object (color, opacity).
– Compositing the slices is similar to ray-casting with all rays traced simultaneously.

Rendering by Slicing - examples

MRI 256³ (back-to-front blending)

The 3DIVE System

Visualization
– Rendering
  • Object-based (region-based)
  • Fast
– Data filtering
  • Color lookup table
  • 3D image processing
Virtual environment
– Display
– Interaction

Virtual Environment

CAVE
– 8 ft. cubed room
– Front, left, right and floor rear-projection
– Flock of Birds magnetic tracking
ImmersaDesk
– 4 x 5 ft. screen
– Single rear-projected display
– Ascension magnetic tracking

Rendering Method: 3D texture mapping


Transfer functions

A transfer function maps scalar values (and possibly their gradients or other derived quantities) to color and opacity values (red, green, blue, alpha).

It may involve a sequence of scalar-to-scalar mappings followed by a “coloring” process (color lookup table, shading, etc.).
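For concreteness, a minimal Python/NumPy sketch (names and table contents are illustrative, not from the slides) of the simplest such mapping, a color/opacity lookup table indexed by the normalized scalar value:

    import numpy as np

    def apply_transfer_function(volume, lut):
        # volume: scalar values normalized to [0, 1]; lut: (N, 4) RGBA table
        idx = np.clip((volume * (len(lut) - 1)).astype(int), 0, len(lut) - 1)
        return lut[idx]                       # result shape: volume.shape + (4,)

    # Example table: 256 entries, grayscale color, opacity ramping with intensity.
    lut = np.zeros((256, 4))
    lut[:, :3] = np.linspace(0.0, 1.0, 256)[:, None]
    lut[:, 3] = np.linspace(0.0, 1.0, 256) ** 2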

Transfer function

(Figure: color & opacity plotted against intensity, comparing a transfer function tuned for surface rendering with one tuned for semi-transparent rendering.)

Transfer function design

The search space is effectively infinite: it must be reduced, and invalid or poor transfer functions avoided.

The quality of a transfer function can only be judged from the visual result, so design is an interactive process.

Approaches:
– Optimizing transfer function parameters with genetic algorithms
– Design galleries
– Image analysis
– Integrated image processing

TF: Parameter optimization

(a) Interactive loop: Dataset → Parameter generation → Volume rendering → Image population → User evaluation, which feeds back into parameter generation.

(b) Automatic loop: Dataset → Parameter generation → Volume rendering → Image population → Automatic evaluation against user objectives, which feeds back into parameter generation.

TF: Parameter optimization (2)

1. Encode the solution
2. Generate the initial population
3. Evaluate the initial solutions and assign a fitness value to each solution
4. While no satisfactory solution is found:
   4.1 Stochastically select an intermediate population
   4.2 Generate a new solution population
   4.3 Evaluate the new solutions
   4.4 Load-balance the population

TF: Parameter optimization (3)

Solution encoding
– Normalized functions: [0,1] → [0,1]
– A solution: Xi = [s1, s2, s3, …, sn] (i.e. samples of the function)

Initial solutions (population)
– Random
– User defined
– Pre-defined simple math functions

Selection of the intermediate population
– Genetic algorithm with proportionate selection based on fitness values: the expected number of offspring of solution i is fi / f, where fi is the fitness value of solution i and f is the average fitness value of the population.

TF: Generation of new solutions

Mutation: for each solution X = [s1, s2, s3, …, sn], a new solution Y = [t1, t2, t3, …, tn] may be generated, where ti is a mutation of si. For example:

  ti = si + fm · dm,  where dm ∈ [−1, 1] is random and fm is a constant that decreases exponentially.

Crossover:
– Randomly pair solutions.
– For each pair, randomly select two points and exchange the segments between the two points.
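A small Python sketch of these two operators, assuming solutions are lists of samples in [0, 1] (the clamping and the schedule for fm are assumptions, not specified on the slide):

    import random

    def mutate(x, f_m):
        # t_i = s_i + f_m * d_m, with d_m random in [-1, 1]; results clamped to [0, 1]
        return [min(1.0, max(0.0, s + f_m * random.uniform(-1.0, 1.0))) for s in x]

    def crossover(x, y):
        # two-point crossover: swap the segment between two random cut points
        n = len(x)
        i, j = sorted(random.sample(range(n + 1), 2))
        return x[:i] + y[i:j] + x[j:], y[:i] + x[i:j] + y[j:]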

TF: Solution evaluation

– Generate an image by volume rendering for each transfer-function solution.
– User evaluation: 1 = like; 0 = don't like; values in (0,1) = somewhere in between.
– Automatic evaluation: an objective function computed by analyzing the images (e.g. histogram analysis).


Parameter optimization (examples)


TF: Design galleries

Parameter optimization narrows down solutions; design galleries disperse them across the parameter space.

Design principles:
– Input vector: a piecewise-linear function (polyline).
– Output vector: a selected set of pixels from each rendered image.
– Dispersion: find a set of input-vector parameters that optimally disperses the output vectors (measured by nearest-neighbor distances).
– Arrangement: organize the resulting images for easy selection and browsing.


TF: Design galleries (2)

Input: a random set of input vectors I and their output vectors O, with |I| = |O| = n.
Output: modified input & output vectors I and O.

Procedure Disperse(I, O, t) {
    for i = 1 to t do {
        j = ran_int(1, n);
        u = perturb(I[j], i);              // new transfer function
        map(u, v);                         // generate output vector "v"
        k = worst_index(O);                // index with the smallest nearest-neighbor distance
        if (is_better(v, O[k], O)) {       // replace O[k] with v
            I[k] = u;  O[k] = v;
        } else if (is_better(v, O[j], O)) {   // otherwise, possibly replace O[j]
            I[j] = u;  O[j] = v;
        }
    }
}

TF: Image analysis

Kindlmann and Durkin (IEEE VolVis Symposium '98)

– Looks for boundary features.
– Edge-detection based.
– Transforms the dataset into a histogram volume.
– Studies the relationship between f, f', and f''.


Position function

Defines the opacity as a function of distance to the surface, giving boundaries of constant thickness. Assumes a Gaussian distribution across the boundary (i.e. a Gaussian-blurred step edge).

Let v = f(x), with x measured along the boundary normal and b the boundary position, so that the position function is p(v) = x − b. Taking the boundary at b = 0 and assuming a Gaussian-blurred step edge,

  f(x) ≈ c1 + c2 · erf(x / (σ√2)),  so  f''(x) / f'(x) ≈ −x / σ²,  i.e.  x ≈ −σ² · f''(x) / f'(x).

Averaging over the boundary gives

  p(v) ≈ −σ² · h(v) / g(v),

where g(v) is the average first derivative f' and h(v) is the average second derivative f'' over voxels with data value v. The opacity α is then assigned as a function of p(v), so that voxels within a fixed distance of the boundary receive high opacity.
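A rough Python/NumPy sketch of this computation under the Gaussian-edge assumption; the finite-difference estimates of f' and f'' below stand in for the histogram-volume measurements used in the original method, and all names are illustrative:

    import numpy as np

    def position_function(volume, sigma, nbins=256):
        # Estimate p(v) = -sigma^2 * h(v) / g(v) per intensity bin v.
        gz, gy, gx = np.gradient(volume.astype(float))
        gmag = np.sqrt(gx**2 + gy**2 + gz**2)            # approximation of f'
        hz, hy, hx = np.gradient(gmag)
        eps = 1e-8
        second = (hx * gx + hy * gy + hz * gz) / (gmag + eps)   # ~ f'' along the gradient

        bins = np.clip((volume / (volume.max() + eps) * (nbins - 1)).astype(int), 0, nbins - 1)
        g = np.zeros(nbins); h = np.zeros(nbins); cnt = np.zeros(nbins)
        np.add.at(g, bins.ravel(), gmag.ravel())
        np.add.at(h, bins.ravel(), second.ravel())
        np.add.at(cnt, bins.ravel(), 1)
        cnt[cnt == 0] = 1
        return -sigma**2 * (h / cnt) / np.maximum(g / cnt, eps)   # p(v), one value per bin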


TF: Integrated image processing

Fang, Biddlecome, Tuceryan, IEEE Vis'98.

Integrates image processing and visualization: a more general approach. A transfer function is represented as a sequence of image processing procedures with intuitive parameterization:

  F = fn ∘ fn−1 ∘ … ∘ f2 ∘ f1

where each fi is an intensity mapping defined over the volume space, representing the result of an image processing procedure.

Coloring: shading and a color table.
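A minimal Python sketch of this composition (the individual mappings and the final coloring step are assumed to be supplied elsewhere; the names are illustrative):

    from functools import reduce

    def compose_transfer_function(mappings):
        # Build F = f_n o ... o f_2 o f_1 from [f_1, f_2, ..., f_n],
        # where each f_i maps a volume to a volume.
        return lambda volume: reduce(lambda vol, f: f(vol), mappings, volume)

    # Hypothetical usage: F = compose_transfer_function([smooth, equalize, window]),
    # followed by coloring and rendering of F(V0).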

Two basic types of intensity mappings

Intensity table: an intensity-to-intensity lookup table representing a piecewise linear function over the volume's intensity domain: [0,1] → [0,1].

Neighborhood function: a function of the intensity values in an m×m×m neighborhood of each voxel: D → [0,1]. A typical form is the 3D spatial convolution of the volume V with a mask T:

  f(x, y, z) = Σ_{i,j,k = −m/2 … m/2}  T[i, j, k] · V[x+i, y+j, z+k]
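A direct (unoptimized) Python/NumPy sketch of this neighborhood convolution; the zero padding at the borders and the names are assumptions:

    import numpy as np

    def neighborhood_convolve(V, T):
        # f(x,y,z) = sum over i,j,k in [-m/2, m/2] of T[i,j,k] * V[x+i, y+j, z+k]
        m = T.shape[0]
        r = m // 2
        Vp = np.pad(V.astype(float), r)               # zero padding at the borders
        out = np.zeros(V.shape, dtype=float)
        for i in range(m):
            for j in range(m):
                for k in range(m):
                    out += T[i, j, k] * Vp[i:i + V.shape[0],
                                           j:j + V.shape[1],
                                           k:k + V.shape[2]]
        return out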

34

V0

Parameter modification Coloring and rendering

point pointf1 f2 f3

V0

Parameter modification Coloring and rendering

f1 f2 f3

V0

Parameter modification Coloring and rendering

slice slicef1 f2 f3

V1 V2 V3

(a) point-basedapproach

(Ray-casting)

(b) volume-basedapproach

(3D texture mapping)

(c) slice-basedapproach

(2D texture mapping)

35

In the point-based approach, when multiple neighborhood functions are used in one transfer function, each voxel may be processed multiple times, since it may fall into the neighborhoods of several sampling points.

Buffering in the point-based approach

– A small fraction (often less than 10%) of the total set of voxels is actually used for each rendering.
– Buffers can be used to avoid repeated computation.

(Diagram: V0 → point sampling → f1 → f2 → f3, with Buffer 1, Buffer 2 and Buffer 3 caching the intermediate result of each mapping.)

Enhancement operations

Point enhancement
– Intensity modification
– Histogram modification (e.g. histogram equalization)

Spatial enhancement
– Smoothing
– Sharpening

Smoothing & Sharpening

Smoothing
– Gaussian:  T(i, j, k) = (1 / (2πσ²)) · e^(−(i² + j² + k²) / (2σ²))
– Median filter: the median value in a neighborhood.

Sharpening
– Laplacian filter:  f(x, y, z) = g(x, y, z) − ∇²g(x, y, z)
– Unsharp masking: blend the low-frequency component V1 with the high-frequency component (V − V1), using a convolution mask:  V1 + γ · (V − V1)
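Continuing the NumPy sketches above (and reusing the neighborhood_convolve helper from the earlier sketch), a rough illustration of the Gaussian mask and of unsharp masking; the mask normalization and the parameter defaults are assumptions:

    import numpy as np

    def gaussian_mask(m, sigma):
        # T(i,j,k) proportional to exp(-(i^2 + j^2 + k^2) / (2*sigma^2)), normalized to sum to 1
        r = m // 2
        i, j, k = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1]
        T = np.exp(-(i**2 + j**2 + k**2) / (2.0 * sigma**2))
        return T / T.sum()

    def unsharp_mask(V, m=3, sigma=1.0, gamma=2.0):
        # blend low-frequency V1 with high-frequency (V - V1): V1 + gamma*(V - V1)
        V1 = neighborhood_convolve(V, gaussian_mask(m, sigma))
        return V1 + gamma * (V - V1)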


Iso-surface rendering by boundary detection

– Apply a boundary (edge) detection operator to identify all boundary voxels (e.g. gradient thresholding).

– Generate a histogram of the boundary voxels' intensity values.

– Extract the intensity values (iso-values) at which the histogram reaches local maxima.
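A minimal Python/NumPy sketch of this procedure; the gradient threshold, bin count, and peak criterion are illustrative assumptions:

    import numpy as np

    def isovalue_candidates(V, grad_thresh, nbins=256):
        gz, gy, gx = np.gradient(V.astype(float))
        gmag = np.sqrt(gx**2 + gy**2 + gz**2)
        boundary_vals = V[gmag > grad_thresh]              # boundary voxels
        hist, edges = np.histogram(boundary_vals, bins=nbins)
        centers = 0.5 * (edges[:-1] + edges[1:])
        # intensities where the boundary-voxel histogram has a local maximum
        return [centers[b] for b in range(1, nbins - 1)
                if hist[b] > hist[b - 1] and hist[b] >= hist[b + 1]]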


Dynamic boundary rendering

Dynamically determine (by edge detection) the boundary points during rendering.

For surfaces that cannot be well defined by iso-values (e.g. in microscopy, photobleaching causes the same material to have different intensities in different focal planes).

Only simple edge detection procedures are used (e.g. convolution based).


Multi-scale iso-value detection


[Witkin, 1983] : “SCALE-SPACE FILTERING”.

Describes signals qualitatively, managing the ambiguity of scale in an organized and natural way.

The signal is expanded by convolution with Gaussian masks over a continuum of sizes.

The “Scale-Space” image is then collapsed, using its qualitative structure (e.g. zero-crossing points), into a tree providing a concise but complete qualitative description covering all scales of observation.

(Figures: scale-space smoothing of a signal s over increasing σ, and the resulting scale-space map.)
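A small 1D Python/NumPy sketch of the idea: smooth a signal with Gaussians over a range of σ and record the zero-crossings of the second derivative at each scale (tracking them into a tree, as Witkin does, is omitted; names and constants are illustrative):

    import numpy as np

    def scale_space_zero_crossings(signal, sigmas):
        crossings = {}
        for s in sigmas:
            k = np.arange(-int(4 * s), int(4 * s) + 1)
            g = np.exp(-k**2 / (2.0 * s * s))
            g /= g.sum()                                   # Gaussian mask of width s
            smoothed = np.convolve(signal, g, mode='same')
            d2 = np.diff(smoothed, 2)                      # discrete second derivative
            sign_change = np.sign(d2[:-1]) * np.sign(d2[1:]) < 0
            crossings[s] = np.where(sign_change)[0] + 1    # approximate positions
        return crossings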


Vector Visualization

The data set is given by vectors, e.g. gaseous and fluid flow (car, ship and aircraft design, blood vessels).

Techniques:
– Hedgehogs / glyphs
– Particle tracing
– Stream-, streak-, time- and path-lines; stream-ribbons, stream-surfaces, stream-polygons, stream-tubes

Steady vs. unsteady flows: the vector field either stays constant or changes with time.


Mappings - Hedgehogs, Glyphs

– Put “icons” at certain places in the flow: oriented lines, glyphs, vortex icons, etc.
– Use icon size (length, area, volume) and direction to encode the vector.
– Tend to clutter the image quickly.

(Example icons: oriented lines, glyphs, vortex icons.)

Examples

Weather Data

(Images: direction mapped to hue; direction mapped to value.)

Tornado

Mappings – Warping

Warping: Animate displacement by deformation and distortion


Mappings – Displacement Plot

Displacement plot: use the scalar values s = v · n (the vector dotted with the normal n).

Mappings - Path-lines

Lines traced by individual particles; a collection of particle traces gives a sense of the time evolution of the flow.

Computed by integrating

  dx/dt = v(x, t),  or equivalently  dx = v(x, t) dt.

Path-line tracing

Euler method:

  x_{i+1} = x_i + v_i · ∆t,   error: O(∆t²)

Second-order Runge-Kutta method:

  x_{i+1} = x_i + (∆t / 2) · (v_i + v'_{i+1}),   error: O(∆t³)

  where v_i = v(x_i, t) and v'_{i+1} = v(x_i + v_i · ∆t, t + ∆t).
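A Python/NumPy sketch of both integrators; v(x, t) is assumed to be a user-supplied callable (e.g. one that interpolates a sampled vector field), and the names are illustrative:

    import numpy as np

    def trace_pathline(v, x0, t0, dt, nsteps, method="rk2"):
        # Integrate dx/dt = v(x, t) from the seed point x0 at time t0.
        x, t = np.asarray(x0, dtype=float), t0
        path = [x.copy()]
        for _ in range(nsteps):
            v0 = np.asarray(v(x, t))
            if method == "euler":                   # x_{i+1} = x_i + v_i*dt, error O(dt^2)
                x = x + v0 * dt
            else:                                   # 2nd-order Runge-Kutta, error O(dt^3)
                v1 = np.asarray(v(x + v0 * dt, t + dt))
                x = x + 0.5 * dt * (v0 + v1)
            t += dt
            path.append(x.copy())
        return np.array(path)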

Time-lines

The positions, at an instant of time, of a batch of particles that were released simultaneously.

(Figures: a time-line shown at T = 1, 2, 3; comparison of a time-line with a path-line.)

Mappings - Streak-lines

The locus, at time t0, of all fluid elements that have previously passed through the point x0; it carries information about the past history of the flow.

Obtained by linking the end-points of particle traces of dx/dt = v(x, t), computing the positions of particles released from the origin point at times t0 − ∆t · i.
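Reusing the trace_pathline sketch from above, a streak line can be approximated by releasing a particle from x0 at each earlier time step and advecting it forward to t0 (again an illustrative sketch, not the slides' implementation):

    def streak_line(v, x0, t0, dt, n):
        # a particle released at t0 - i*dt needs i steps of size dt to reach t0
        points = []
        for i in range(1, n + 1):
            path = trace_pathline(v, x0, t0 - i * dt, dt, i)
            points.append(path[-1])
        return points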


Mappings - Streamlines

Everywhere tangent to the flow; a mathematical curve that exists only at a fixed time t0.

Identical to path-lines and streak-lines in steady flow.

An integral curve along the curve parameter s (the arc length of the curve):

  dx/ds = v(x, t0),   x = ∫ v ds


Mappings - compare


Mappings - Contours


Mappings - Stream-ribbon

Used to show vorticity, i.e. places where the flow twists; this requires surface information.

Idea: trace neighboring particles and connect them with polygons, then shade those polygons appropriately to show the twisting.

Problem: flow divergence (the “spread”).

Solution: trace one streamline together with a constant-size vector (the curve's normal vector).

Mappings - Stream-tube

Stream-tube: generate a stream-line and connect circular cross-sections along the stream-line.


Mappings - Stream-surface

Stream-surface: the collection of stream-lines passing through a base curve (the rake).

– If the rake is closed: a stream tube.
– If the rake is open and short: a stream ribbon.
– No flow can pass through a stream surface.
– Constructed by connecting polygons.


Mappings - Flow Volumes

Instead of tracing a line - trace a small polyhedron

Flow Volume (1)

– A seed polygon (a square), placed perpendicular to the flow at its center, is used as the smoke generator.
– The square can be subdivided into a finer mesh.
– The volume is adaptively subdivided in areas of high divergence (e.g. when edges become too long).
– There is no merging.
– The result is an irregular volume (various topologies).


Flow Volume (2)

Can simulate puffing smoke

Can be color-coded to represent other fields.


Rendering

Hedgehogs & glyphs
– Oriented lines
– Polygonal representations

Stream-** techniques:
– Curves
– Polygonal models
– Volumetric models (flow volumes)

(Figure: a hedgehog plot.)

Image Processing

Apply a vector field to an image to create motion.
