
SvPablo Evaluation Report

Hans Sherburne, Adam Leko
UPC Group

HCS Research Laboratory, University of Florida

Color encoding key:

Blue: Information

Red: Negative note

Green: Positive note


Basic Information

Name: SvPablo
Developer: University of Illinois
Current versions:
  SvPablo 6.0
  SDDF component 5.5
  Trace Library component 5.1.4
Website: http://www.renci.unc.edu/Software/Pablo/pablo.htm
Contact: unknown (the email addresses on the website and in the documentation are no longer valid)


Introduction

SvPablo
  Part of the Pablo Project at UIUC
  Last website update: June 2004
  Website now hosted at www.renci.unc.edu
  Project appears to be inactive at this point
GUI for source code correlation of performance data
Instrumentation of source code
  Automatic
  Interactive
Display of performance data
  Color-coded indicators beside procedure listings and source code
  Popup dialogs display more detailed information
  Scalability graph


Instrumentation/Visualization Process In SvPablo

Figure 2: instrumentation/visualization process in SvPablo (c/o [1])


Performance Data in SvPablo

What can be instrumented?
  Function calls
  Outer loops
What type of performance data is available?
  Procedure statistics: statistics describing all instrumented calls to a procedure
  Call statistics: statistics describing a specific procedure callsite
  Loop statistics: statistics describing an outer loop
Statistics provided
  Exclusive and inclusive duration
  Count
  Max, min, mean, standard deviation
  Hardware counters (if available)
Scalability
  Run the program with the same instrumentation configuration using a varying number of nodes
  SvPablo will calculate and graph parallel efficiency for each instrumented procedure and loop (see the efficiency sketch after this list)
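The slides do not spell out the exact expression SvPablo evaluates; assuming the conventional relative-efficiency definition for a fixed problem size, the quantity graphed for an instrumented procedure or loop would look like the following, where T(p) is the region's duration on p processes and p_0 is the smallest run in the comparison (E(p) = 1 being ideal scaling):

    E(p) = \frac{p_0 \, T(p_0)}{p \, T(p)}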


Main Window in SvPablo

Figure 1: Main window in SvPablo


Self-Defining Data Format

Self-Defining Data Format (SDDF) is the format used to store performance data in SvPablo
Provides a flexible structure
  Allows multi-language analysis support
  Makes it possible to add new metrics easily
Performance file entries (pictured conceptually in the sketch after this list)
  Configuration records: specify GUI display information for event and procedure statistics
  Event statistics records: in SvPablo, events are routine calls, loops, or hardware counter values; these records define and give values for routine callsites and loops
  Procedure statistics records: define and give values for procedures
  Event locality records: link events and procedures
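Purely as a conceptual sketch of what an event statistics record carries, based on the statistics listed on the previous slides; the real SDDF records are self-describing and use different names, so the type and field names below are hypothetical:

    /* Hypothetical C view of the data in an SvPablo event statistics record;
     * not the actual SDDF schema or field names. */
    #include <stdio.h>

    struct event_stats {
        int    event_id;        /* routine callsite or loop being described */
        int    source_line;     /* ties the event back to the source code */
        long   count;           /* number of times the event executed */
        double incl_seconds;    /* inclusive duration */
        double excl_seconds;    /* exclusive duration */
        double min_seconds;     /* min, max, mean, std dev across executions */
        double max_seconds;
        double mean_seconds;
        double stddev_seconds;
        long   hw_counter[4];   /* hardware counter values, if available */
    };

    int main(void)
    {
        struct event_stats s = { 1, 42, 1000, 3.2, 2.9,
                                 0.001, 0.020, 0.003, 0.0005, { 0, 0, 0, 0 } };
        printf("event %d at line %d: %ld executions, %.3f s exclusive\n",
               s.event_id, s.source_line, s.count, s.excl_seconds);
        return 0;
    }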


Data Capture Library

Offers a set of functions which can be used to instrument code (see the hedged sketch after this list)
  Start/End Data Capture
  Function Entry/Exit
  Loop Start/End
  OpenMP applications
    Threaded Function Entry/Exit
    Threaded Loop Entry/Exit
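A minimal sketch of what manual instrumentation with a capture library of this kind looks like; the svp_* names are hypothetical stand-ins for the call types listed above, not the real Data Capture Library API, and the stubs only log so the example runs on its own:

    /* Hypothetical instrumentation sketch; svp_* names are placeholders. */
    #include <stdio.h>

    static void svp_start_capture(void)             { puts("[capture] start"); }
    static void svp_end_capture(void)               { puts("[capture] end"); }
    static void svp_event(const char *what, int id) { printf("[capture] %s %d\n", what, id); }

    static double work(int n)
    {
        double sum = 0.0;
        int i;
        svp_event("loop-start", 1);        /* outer-loop boundary */
        for (i = 0; i < n; i++)
            sum += (double)i;
        svp_event("loop-end", 1);
        return sum;
    }

    int main(void)
    {
        double s;
        svp_start_capture();               /* begin data capture */
        svp_event("function-entry", 0);    /* enter instrumented routine */
        s = work(1000000);
        svp_event("function-exit", 0);     /* leave instrumented routine */
        svp_end_capture();                 /* flush and close performance data */
        printf("sum = %f\n", s);
        return 0;
    }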


Errors Abound in SvPablo

Parser
  Does not understand C++-style comments
  Does not handle variables declared after the top of a function (both illustrated in the C sketch after this list)
  Once a source file is loaded that cannot be parsed by SvPablo, the application must be restarted to view correlated data for any source code!
GUI
  Some lines in CAMEL appear instrumentable but are not; instrumenting these lines causes runtime errors
  Routines correctly correlated in the Main window are not correctly correlated in the "Procedure Statistics" window
Compiling instrumented code
  Problems with "void" return types in some code
  Some instrumented source code output by SvPablo could not be compiled
Executing instrumented code
  Errors are generated for functions ending with "exit(0)"
Scalability graph
  Have to change screen resolution to see the entire graph
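For concreteness, a small C fragment showing the two constructs the parser rejects and an equivalent form it accepts; the functions themselves are illustrative stand-ins, not taken from CAMEL:

    /* Illustrative only: the first function is valid C99 and compiles,
     * but contains the two constructs the SvPablo parser cannot handle. */
    #include <stdio.h>

    int sum_rejected(int n)
    {
        int total = 0;            // C++-style comment: not understood
        total += n;
        int doubled = 2 * total;  /* declared after a statement, i.e. not at
                                     the top of the function: also rejected */
        return doubled;
    }

    /* Equivalent form the parser accepts: C-style comments only,
     * all declarations at the top of the function. */
    int sum_accepted(int n)
    {
        int total = 0;
        int doubled;

        total += n;
        doubled = 2 * total;
        return doubled;
    }

    int main(void)
    {
        printf("%d %d\n", sum_rejected(21), sum_accepted(21));
        return 0;
    }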


SvPablo – Overhead

All programs executed correctly when instrumented
Benchmarks marked with a star had high variability in execution time; readings with stars are probably not accurate
Instrumenting a large number of loops creates high overhead!


Evaluation (1)

Available metrics: 2/5
  Can use PAPI and MIPS R10000 hardware counters (not evaluated)
  Other statistics based on loop and function call count and duration are provided
  No statistics regarding communication are provided
Cost: 5/5
  SvPablo is freely available
Documentation quality: 3/5
  Documentation covers all available features; however, figure placement and arduous terminology impede quick understanding
  Documentation does describe how one might extend the tool
Extendibility: 3/5
  SvPablo source code is freely available
  May be more of a hassle than helpful due to the large number of bugs
  Project appears to be inactive, so we'd likely be on our own for support
Filtering and aggregation: 3/5
  Only hardware counter values and statistical data are recorded


Evaluation (2)

Hardware support: 4/5
  Sun Solaris, SGI IRIX, IBM SP, Compaq Alpha, NEC SX6, Linux workstations
Heterogeneity support: 0/5 (not supported)
Installation: 3.5/5
  Installation on the Linux platform required a number of manual configurations, but was not bad overall
Interoperability: 3/5
  SvPablo uses the SDDF file format
  The source code for the SDDF component is freely available and documented
  An SDDF-to-XML conversion component is also freely available
Learning curve: 3.5/5
  The interface is fairly intuitive, but takes some use to get comfortable
  The terminology and project file hierarchy are a bit cumbersome
Manual overhead: 3/5
  It is fairly straightforward to instrument all loops and routines
  It is necessary to manually select only important loops in order to keep overhead low
Measurement accuracy: 3/5
  Tracing all loops increased overhead substantially in CAMEL


Evaluation (3)

Multiple analyses: 1/5
  The scalability analysis view is the only means of analysis provided
Multiple executions: 3.5/5
  SvPablo includes a view to select performance data from multiple runs and view a graph plotting efficiency for each instrumented routine
Multiple views: 2/5
  A limited number of views are available
  Only profile data (not trace data) is viewable
Performance bottleneck identification: 2.5/5
  The scalability view shows methods with poor parallel efficiency
  Routines can be sorted based on maximum exclusive duration
Profiling/tracing support: 1.5/5
  Only profiling is supported
  Profiling is done on routine and loop execution metrics
  Communication profiling is not available


Evaluation (4)

Response time: 2/5
  Data is not available in SvPablo until after execution completes and performance data is processed
Software support: 3/5
  The MPI profiling library allows linking against different MPI implementations
  C and Fortran are fully supported; PGI HPF is partially supported (no selective instrumentation)
Source code correlation: 4/5
  Source code correlation of profile data is the main view offered
System stability: 2.5/5
  SvPablo is very finicky about C syntax (once source code is loaded with syntax that cannot be understood, the program must be restarted!)
  On occasion SvPablo segfaults for unknown reasons
Technical support: 0/5
  Email addresses listed in the documentation and on the webpage are no longer valid


Bottleneck Identification: Performance Tool Test Suite: CAMEL, LU

Testing metric: what did the profile data tell us?
CAMEL: FAIL
  Not possible to profile a section of code that is not a loop or function call
  Not possible to represent actual dynamic behavior (no trace)
  Required a lot of effort to clean up syntax; can't have:
    C++ "//" style comments
    Variable initialization after the beginning of a function
NAS LU: NOT TESTED
  Unable to successfully instrument the code
    Segmentation fault when opening init_comm.f
    Instrumenting lu.f alone causes execution errors


Bottleneck Identification: Performance Tool Test Suite: PPerfMark

Big message: PASSED
  Profile showed a large amount of time spent in Send and Receive
Diffuse procedure: PASSED
  Profile showed a large amount of time spent in the bottleneck procedure, even though time is diffused across processes
Hot procedure: PASSED
  Profile showed a large amount of time spent in the bottleneck procedure
Intensive server: TOSS-UP
  Profile showed a large amount of time spent in Receive and in waste_time()
  It would take a lot of reasoning to figure out that the two are related
Ping pong: TOSS-UP
  Profile indicates lots of time spent in Receive and in Send
  Does not show the communication pattern between the two processes
Random barrier: TOSS-UP
  Profile shows lots of time spent in Barrier and in waste_time()
  Profile does not show the communication pattern amongst processes
Small messages: TOSS-UP
  Profile shows lots of time spent in Send and Receive
  Very high standard deviation and difference between max/min
  Profile does not show the communication pattern amongst processes
System time: TOSS-UP
  Profile shows lots of time spent in kill() and getpid()
  No distinction is made between user and system calls
Wrong way: TOSS-UP
  Profile shows lots of time spent in Send and Receive
  Profile does not show the communication pattern amongst processes


Conclusions

The use of a GUI for interactive code instrumentation simplifies the process for the user

The source code is available, and fairly well documented

Extension of this tool is a possibility
  Risky due to no support
  Lots of errors experienced!


References

1. SvPablo User's Guide, ftp://www.renci.unc.edu/pub/Pablo.Release.5/Documentation/SvPabloGuide.ps.gz