From GNU/GCC to the autotools
Thierry GAYET
PLAN
1. Development environment
2. Build process on GNU/Linux
3. Tools needed for development
4. Static and dynamic libraries
5. Manual build
6. Building with GNU/make
7. Building with the autotools:
 1. Automatic build using the autotools
 2. Using the autotools
 3. Pragmatic example in action
8. GNU/make/autotools vs tmake
9. Conclusion
1. Development environment
Workstation
06/17/10
On a workstation, everything is on the local rootfs:
• The binutils: ar, ldd, strip, …
• The toolchain: gcc, system libraries (libc, libm, libpthread, …), headers, …
• The debugger: gdb
Only one architecture is in use at a time (x86, x86-64, …), so we can only compile for it.
http://www.pathname.com/fhs/
Cross-compilation for an embedded target
SDK (stagingdir): a temporary rootfs dedicated to the target architecture. If several architectures must be supported, several SDKs are needed too.
Toolchain: gcc, headers, host tools, …
Version control holds the sources; an intermediate installation into the SDK is used for compilation (headers) and linking (libraries).
The final installation goes onto the real rootfs of the embedded target, deployed by firmware flashing or NFS boot.
2. Build process on GNU/Linux
Summary
Source code (.c / .cpp) + headers (.h)
→ Preprocessing (gcc -E)
→ Compiling (gcc -c)
→ Assembler (as) → ELF objects (.o)
→ Linking (ld) → output: ELF binary, static library (.a), or dynamic library (.so)
C preprocessing
• The C preprocessor is a lexical preprocessor: it operates on the source text, prior to any parsing;
• It performs simple substitution of tokenized character sequences: macro substitution by inlining, textual inclusion of other files, and conditional compilation or inclusion;
• It takes lines beginning with '#' as directives.
Example of using the C preprocessor on a C file:
$ gcc -E file.c
(This is the same gcc front end used for generating an intermediate object; since no -o parameter is given, the preprocessed output goes to stdout.)
Assembler
If the -S parameter is given, stop after the stage of compilation proper; do not assemble. The output is in the form of an assembler code file for each non-assembler input file specified.
By default, the assembler file name for a source file is made by replacing the suffix .c, .i, etc., with .s.
Input files that don't require compilation are ignored.
Example: getting the assembly code of a .c source file:
$ gcc -S file.c
http://homepage.fudan.edu.cn/~euler/gcc_asm/
Compiling: the frontend of the compiler
• Parses the source code through a lexical/semantic analyzer (such as gnu/flex);
• Builds an internal representation of the program;
• In a first step, generates non-optimized intermediate code;
• In a second step, generates optimized intermediate code (if required);
• The intermediate code is adapted to the target architecture.
Example: generating an intermediate object:

$ gcc -c file.c -o file.o
$ file file.o
file.o: ELF 32-bit LSB relocatable, Intel 80386, version 1 (SYSV), not stripped
GCC = GNU Compiler Collection
Front ends: C, Java, Fortran, Pascal, ASM, …
Linking: the backend of the compiler
• Generates the final output;
• Can generate ELF objects (replacing the old a.out format);
• Uses ld through gcc;
• Can produce:
 - a binary: adds a main entry to the intermediate code;
 - a dynamic library: if the -shared parameter has been given (not runnable, except for the libc).
Direct usage with the ld linker:
$ ld -o mybinary /lib/crt0.o file.o -lc
crt0 (or crt0.o, gcrt0.o, mcrt0.o) is a set of execution startup routines (usually part of the C standard library) that are platform-dependent, and is required in order to compile using GCC and other GNU tools. crt stands for "C runtime".
-e entryUse entry as the explicit symbol for beginning execution of your program, rather than the default entry point (main). If there is no symbol named entry, the linker will try to parse entry as a number, and use that as the entry address (the number will be interpreted in base 10; you may use a leading 0x for base 16, or a leading 0 for base 8).
3. Tools needed for development
The toolchain, the heart of the development process
A toolchain is the set of programming tools that are used to create a product (typically another computer program or system of programs). The tools may be used in a chain, so that the output of each tool becomes the input for the next, but the term is used widely to refer to any set of linked development tools.
A simple software development toolchain consists of a text editor for editing source code, a compiler and linker to transform the source code into an executable program, libraries to provide interfaces to the operating system, and a debugger. A complex product such as a video game needs tools for preparing sound effects, music, textures, 3-dimensional models, and animations, and further tools for combining these resources into the finished product.
On a workstation, the toolchain is made up of all the tools used for compilation (gcc, gdb, ...). It includes all the headers and native libraries that can be used for compilation and linking.

Besides, a toolchain is often used for cross-compiling code for another target. In order to use the right one, a prefix is used (sh4-linux-gcc, …); it includes the binutils.
The binutils
The GNU Binutils are a collection of binary tools. The main ones are:
* ld - the GNU linker.
* as - the GNU assembler.
But they also include:
* addr2line - Converts addresses into filenames and line numbers.
* ar - A utility for creating, modifying and extracting from archives.
* gprof - Displays profiling information.
* nm - Lists symbols from object files.
* objcopy - Copies and translates object files.
* objdump - Displays information from object files.
* ranlib - Generates an index to the contents of an archive.
* readelf - Displays information from any ELF format object file.
* size - Lists the section sizes of an object or archive file.
* strings - Lists printable strings from files.
* strip - Discards symbols.
For cross-compilation, a prefix is added before the name of each binutil (sh4-linux-ar, sh4-linux-nm, sh4-linux-strip, ....).
4. Static and dynamic libraries
Summary of the differences

• Static libraries:
 - integrate all symbols into one binary;
 - the nm command displays only the built-in internal functions.
• Dynamic libraries:
 - use external symbols through dynamic libraries;
 - the nm command shows the internal (T) or external (U) symbols;
 - the ldd command displays the library dependencies of a binary.
Static libraries: creation and extraction

Creating a static library is an easy task thanks to the ar tool from the binutils:
$ ar -rv mystaticlibrary.a file1.o file2.o file3.o
or
$ ar -rv mystaticlibrary.a *.o
The previous command merges the three objects (file1.o, file2.o and file3.o) into one archive.
A static library:
• is a container for ELF objects;
• is similar to an archive such as zip or tar;
• has a .a extension.
A static library is not linked but is an archive made by the ar binutil. Thus, it is possible to extract the built-in objects from it.

Extracting the content of a static library uses the same tool:
$ ar x mystaticlibrary.a
Some tools such as Midnight Commander (mc) can browse inside it.
For more information: man ar
Static libraries: tests

It is also possible to list the object files included in the library:

$ ar -t mystaticlibrary.a
cominter.o
com_util.o
filointer.o
miscell.o
paral.o
pilot.o
simul_api.o
spyinter.o
userint.o

The following command lists the symbols per object:

$ nm mystaticlibrary.a
ad_server.o:
00000099 T affichage_etat_client
00000004 C bDebugAd
00000004 C bDebugSu
         U bTrace
00000390 T close_socket

cominter.o:
00000004 d bComInit
00000000 d bNoInitWarn
00000010 b bTrComInter

U: undefined (external implementation)
T: implemented internally
(For an object, nm lists the functions; for a library, the functions per object.)
Static libraries: linking

A link with a static library will bring all the functions used within the source code into the final binary.

The final symbol table will be a merge of the functions used inside the source code and the functions from the static library:
$ gcc test.o mystaticlibrary.a -o test
Because a static library is like an archive that contains ELF objects, a link with a static library is similar to a link with other ELF objects:
$ gcc file.o mystaticlibrary.a -o test
or
$ gcc file.o file1.o file2.o -o test
with mystaticlibrary.a containing the two files file1.o and file2.o. Static libraries are often used because only the symbols actually in use are included in the final binary.
Dynamic libraries: creation

Creating a dynamic library requires linking all the objects into one ELF object. We don't need any other binutil, just the ld linker itself through gcc:

$ gcc -Wall -fPIC -c test1.c -o test1.o
$ gcc -Wall -fPIC -c test2.c -o test2.o
$ gcc -shared -Wl,-soname,libtest.so.1 -o libtest.so.1.0 test1.o test2.o
Listing of parameters that can be given to the compiler
These libraries have a .so extension, and they are associated with a version (major and minor). For more information: man ld
Compiler option    Definition

-Wall      Include all warnings. See the man page for the warnings covered.
-fPIC      Compiler directive to output position-independent code, a characteristic required by shared libraries. Also see -fpic.
-shared    Produce a shared object which can then be linked with other objects to form an executable.
-Wl        Pass options to the linker. In this example the option passed to the linker is "-soname libtest.so.1".
-o         Output name. In this case the name of the shared object to be output will be libtest.so.1.0.
Dynamic libraries: dependencies and runtime

Once a binary is built, it is possible to get a listing of its dynamic dependencies:
$ ldd mybinary
libanasm7.so => /home/tgayet/vittam2/lib.i386_linux/libanasm7.so (0x40017000)
libc.so.6 => /lib/tls/libc.so.6 (0x42000000)
/lib/ld-linux.so.2 => /lib/ld-linux.so.2 (0x40000000)
This command says which libraries the binary is linked with; it also gives the path of each library.

The nm tool can also be used to list the symbols (the functions built inside). The library must not be stripped.
At runtime, a binary linked with a dynamic library will check the local cache maintained by the ldconfig tool. Previously the configuration was made in /etc/ld.so.conf, but it has been replaced by the /etc/ld.so.conf.d/ directory.

Regenerating the cache: $ sudo ldconfig
Dumping the cache: $ sudo ldconfig -p
A dynamic library is bound to a binary only at runtime, so the library lookup can be overridden through the LD_LIBRARY_PATH environment variable:
$ export LD_LIBRARY_PATH=./my_lib_path $ ./mybinary
Dynamic libraries: linking

In order to link a dynamic library with a binary, we can pass some information to the ld linker:
$ gcc -fPIC test.o -L./libpath -ltest
This command says that the test.o object that will make the final binary will be dynamically linked with the libtest.so library. The "lib" prefix should not be specified because it is automatically added (for libtest.so, only test should be given to the -l parameter).

The -l<libname> parameter asks the linker to search for a library called lib<libname>. By default, the dynamic one (lib<libname>.so) is used unless the -static parameter is provided.
Linker option Description
-l    Provide a library name (e.g. for libtest.so, only -ltest is specified)
-L    Provide a path, or a set of paths, where libraries are searched
Dynamic libraries: pmap / exmap

$ pmap -x 22388
22388:   bash
Address   Kbytes  RSS  Anon  Locked  Mode   Mapping
08048000     780    -     -       -  r-x--  bash
0810b000       4    -     -       -  r----  bash
0810c000      20    -     -       -  rw---  bash
0021f000    1356    -     -       -  r-x--  libc-2.11.1.so
00372000       4    -     -       -  -----  libc-2.11.1.so
00373000       8    -     -       -  r----  libc-2.11.1.so
00375000       4    -     -       -  rw---  libc-2.11.1.so
009a9000     108    -     -       -  r-x--  ld-2.11.1.so
009c4000       4    -     -       -  r----  ld-2.11.1.so
009c5000       4    -     -       -  rw---  ld-2.11.1.so
00cc4000      24    -     -       -  r-x--  libnss_compat-2.11.1.so
00cca000       4    -     -       -  r----  libnss_compat-2.11.1.so
00ccb000       4    -     -       -  rw---  libnss_compat-2.11.1.so
00e17000       8    -     -       -  r-x--  libdl-2.11.1.so
00e19000       4    -     -       -  r----  libdl-2.11.1.so
00e1a000       4    -     -       -  rw---  libdl-2.11.1.so
00f99000      76    -     -       -  r-x--  libnsl-2.11.1.so
00fac000       4    -     -       -  r----  libnsl-2.11.1.so
00fad000       4    -     -       -  rw---  libnsl-2.11.1.so
bffbe000      84    -     -       -  rw---  [ stack ]
--------  ------
total kB    9748
Example of the memory map for the bash binary.

Dynamic libraries are loaded in memory only once, even if they are used by several processes.

These tools are very useful to know how much memory is used by a binary.

Another option is the exmap tool.
5. Manual build
Manual build
• Compiling : Obtaining an intermediate object :
$ gcc -c my_file.c -o my_file.o
(The -c flag should be used only for individual compilation)
• Linking : Standalone Binary (ELF object)
$ gcc my_file.o -o my_bin
Binary with dynamic link: $ gcc -L<LIB_PATH> -lmy_lib my_file.o -o my_bin
Binary with static link:$ gcc -static -L<LIB_PATH> -lmy_lib my_file.o -o my_bin
$ gcc my_file.o my_lib.a -o my_bin
Debug mode

gcc has a set of parameters that can be used for customizing the debug mode:

-g       Produce debugging information in the operating system's native format (stabs, COFF, XCOFF, or DWARF 2). GDB can work with this debugging information.
-ggdb    Produce debugging information for use by GDB. This means to use the most expressive format available (DWARF 2, stabs, or the native format if neither of those are supported), including GDB extensions if at all possible.
-Werror  All warnings become errors.
-Wall    Enable all warnings.
For more information : man gcc
Optimization mode

gcc has a set of parameters that can be used for customizing the optimization level:

-O / -O1  Optimize. Optimizing compilation takes somewhat more time, and a lot more memory for a large function.
-O2       Optimize even more. GCC performs nearly all supported optimizations that do not involve a space-speed tradeoff. The compiler does not perform loop unrolling or function inlining when you specify -O2. As compared to -O, this option increases both compilation time and the performance of the generated code.
-O3       Optimize yet more. -O3 turns on all optimizations specified by -O2 and also turns on the -finline-functions and -frename-registers options.
-O0       Do not optimize. This is the default.
-Os       Optimize for size. -Os enables all -O2 optimizations that do not typically increase code size. It also performs further optimizations designed to reduce code size. -Os disables the following optimization flags: -falign-functions -falign-jumps -falign-loops -falign-labels -freorder-blocks -fprefetch-loop-arrays

If multiple -O options are given, with or without level numbers, the last one is the one that takes effect.

By definition, no optimization should be set (-O0) when the debug mode is used.
Conditional compilation
In order to customize the source code that will be compiled, we can provide the C preprocessor with some compilation flags (CFLAGS) that enable or disable parts of the code.

Example of a part of code to include:
#ifdef CONDITION
printf("If you see this message, it means that this code is enabled.");
#endif

To enable this part of the code, we use one of the following commands:
$ gcc -c -o test.o -DCONDITION test.c
or
$ gcc -c -o test.o -DCONDITION=1 test.c

Other C preprocessor directives:
#ifdef : if defined
#ifndef : if not defined
#if : if
#elif : else if
#endif : end of an if or ifdef condition
Example of usage :
#if (__cplusplus==199711L)
…
#endif

#ifndef OnceTime
#define OnceTime
#endif
(an include guard: the content is included only one time)
Built-in coverage with gnu/gcov
gcov is a test coverage program. Use it in concert with GCC to analyze your programs to help create more efficient, faster running code and to discover untested parts of your program. You can use gcov as a profiling tool to help discover where your optimization efforts will best affect your code. You can also use gcov along with the other profiling tool, gprof, to assess which parts of your code use the greatest amount of computing time.
Profiling tools help you analyze your code's performance. Using a profiler such as gcov or gprof, you can find out some basic performance statistics, such as:
• how often each line of code executes• what lines of code are actually executed• how much computing time each section of code uses
$ gcc -fprofile-arcs -ftest-coverage -g tableau.c -o tableau
$ gcov tableau.c
92.31% of 26 source lines executed in file tableau.c
Creating tableau.c.gcov.
$ more tableau.c.gcov
           void permuter_cases (char *tableau, int i1, int i2, int taille)
      20   {
      20       char inter;
      20       if( (i1 < 0) || (i2 < 0) || (i1 >= taille) || (i2 >= taille))
Built-in profiling with gnu/gprof 1/3

The first step in generating profile information for your program is to compile and link it with profiling enabled. To compile a source file for profiling, specify the `-pg' option when you run the compiler. (This is in addition to the options you normally use.)
To link the program for profiling, if you use a compiler such as cc to do the linking, simply specify `-pg' in addition to your usual options. The same option, `-pg', alters either compilation or linking to do what is necessary for profiling. Here are examples:
$ gcc -g -c myprog.c utils.c -pg     (compilation)
$ gcc -o myprog myprog.o utils.o -pg     (link)
The `-pg' option also works with a command that both compiles and links:
$ gcc -o myprog myprog.c utils.c -g -pg
Note: The `-pg' option must be part of your compilation options as well as your link options. If it is not then no call-graph data will be gathered and when you run gprof you will get an error message like this:
gprof: gmon.out file is missing call-graph data
If you add the `-Q' switch to suppress the printing of the call graph data you will still be able to see the time samples:
Flat profile: Each sample counts as 0.01 seconds.
   %  cumulative     self               self    total
time      seconds  seconds    calls  ms/call  ms/call  name
33.34        0.02     0.02     7208     0.00     0.00  open
16.67        0.03     0.01      244     0.04     0.12  offtime
16.67        0.04     0.01        8     1.25     1.25  memccpy
16.67        0.05     0.01        7     1.43     1.43  write
Profiling with gnu/gprof 2/3

Gprof profiling is similar in some ways to prof profiling. Instead of prof's option -p, the usual option to enable gprof profiling is -pg. The linker links against a different mcount() function which maintains exact counts of entries into each function by individual call sites, probably by walking the stack at run-time to find the address the called function will return to.
The gprof post-processor then constructs the call graph for the program, and propagates function execution time (obtained from the PC sampling) through the call graph, proportionally to the number of calls from each call site for the function. The resulting weighted call graph gives a more thorough picture of inefficiencies in the program; however the call graph may be substantially inaccurate when:
• Propagating execution time meaningfully is difficult when there is recursion (i.e., the call graph is not a tree).
• The heuristic of allocating execution time of a function to its call sites proportionally to the number of calls from each call site fails because different call sites made substantially different demands on the function. E.g., a function might be called equal number of times from location A and B, but the average latency for calls from A might be 100 times longer than the average latency for calls from B; nevertheless, gprof would assign equal amounts of time to be propagated up the call graph to locations A and B.
Gprof with pthreads requires some adaptation of the code.
Profiling with gnu/gprof 3/3

The remaining functions in the listing (those whose self seconds field is 0.00) didn't appear in the histogram samples at all. However, the call graph indicated that they were called, so therefore they are listed, sorted in decreasing order by the calls field. Clearly some time was spent executing these functions, but the paucity of histogram samples prevents any determination of how much time each took.
Here is what the fields in each line mean:

% time    This is the percentage of the total execution time your program spent in this function. These should all add up to 100%.
cumulative seconds This is the cumulative total number of seconds the computer spent executing this functions, plus the time spent in all the functions above this one in this table.
self seconds This is the number of seconds accounted for by this function alone. The flat profile listing is sorted first by this number.
calls This is the total number of times the function was called. If the function was never called, or the number of times it was called cannot be determined (probably because the function was not compiled with profiling enabled), the calls field is blank.
self ms/call This represents the average number of milliseconds spent in this function per call, if this function is profiled. Otherwise, this field is blank for this function.
total ms/call This represents the average number of milliseconds spent in this function and its descendants per call, if this function is profiled. Otherwise, this field is blank for this function. This is the only field in the flat profile that uses call graph analysis.
Name This is the name of the function. The flat profile is sorted by this field alphabetically after the self seconds and calls fields are sorted.
6. Building with GNU/MAKE
Automatic build using gnu/make

The make program gets its dependency "graph" from a text file called makefile or Makefile which resides in the same directory as the source files. Make checks the modification times of the files, and whenever a file becomes "newer" than something that depends on it (in other words, modified), it runs the compiler accordingly.
project1 is a target and the name of the final binary. Its dependencies are the three objects data.o, main.o and io.o. Each one in turn has its own dependencies, which make will resolve one by one.
If you edit io.c, it becomes "newer" than io.o, meaning that make must run cc -c io.c to create a new io.o, then run cc data.o main.o io.o -o project1 for project1.
Automatic build using gnu/make

Each dependency shown in the graph is circled with a corresponding color in the Makefile, and each uses the following format:

target : source file(s)
        command (must be preceded by a tab)
A target given in the Makefile is a file which will be created or updated when any of its source files are modified. The command(s) given in the subsequent line(s) (which must be preceded by a tab character) are executed in order to create the target file.
For more information : http://www.gnu.org/doc/doc.html
Automatic build using gnu/make

By default make looks for a file named "Makefile". We can specify another one through the -f parameter:

$ make -f myMakefile     (will use myMakefile instead)
$ make -C /myProject     (will use the Makefile present in the /myProject directory)

If no Makefile is found, the make program will display:
make: *** No targets specified and no makefile found. Stop.
Make is suited both to native compilation and cross-compilation. Adding a specific rule for using a toolchain is very easy:
CC=$(PREFIX)-gcc

.c.o:
        $(CC) -c $(CFLAGS) -o $@ $<
It can get its settings from environment variables or parameters:

export TOOLCHAIN_BIN=<PATH_TOOLCHAIN>/bin
export PREFIX=sh4-linux
or
make ARCH=sh4-linux
then launch the “make” command
Automatic build using gnu/make
In addition to those macros which you can create yourself, there are a few macros which are used internally by the make program.
Here are some of those, listed below:
You can also manipulate the way these macros are evaluated. For example, assuming that OBJS = data.o io.o main.o, using $(OBJS:.o=.c) within the Makefile substitutes the trailing .o with .c, giving you the following result: data.c io.c main.c
For debugging a Makefile : $ make -d
CC Contains the current C compiler. Defaults to gcc.
CFLAGS Special options which are added to the built-in C rule.
$@ Full name of the current target.
$? A list of files for current dependency which are out-of-date.
$< The source file of the current (single) dependency.
LDFLAGS Special options which are added to the link.
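The substitution syntax and the automatic variables can be exercised with a throwaway makefile (subst.mk and demo.c are invented names):

```shell
# OBJS substitution: $(OBJS:.o=.c) replaces the trailing .o with .c.
printf 'OBJS = data.o io.o main.o\n'          >  subst.mk
printf 'show:\n\t@echo $(OBJS:.o=.c)\n'       >> subst.mk
# Pattern rule using the automatic variables $@ (target) and $< (source):
printf '%%.o: %%.c\n\t@echo compiling $< into $@\n' >> subst.mk

touch demo.c
make -f subst.mk show      # prints: data.c io.c main.c
make -f subst.mk demo.o    # prints: compiling demo.c into demo.o
```

The @ in front of echo suppresses make's usual command echoing, so only the expanded values appear.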
Using gnu/make

• Compilation and link:
 - Use "make" from the command line (it will call the default target, such as "all");
 - It will look for the current Makefile (or another file given with -f);
 - Not many default rules are included; almost everything must be implemented manually.
• Installation:
 - Use "make install" from the command line;
 - Not standardized: it must be written on your own;
 - The install/cp commands can be used.
• Cleaning:
 - Use "make clean" from the command line;
 - Removes temporary and intermediate objects generated during the build process.
• Archive/delivery:
 - Use "make delivery" from the command line;
 - Not standardized: it must also be written on your own;
 - The tar.gz/tar.bz2/.7z/zip/... compression formats can be used.
(Each make rule transforms an input file, e.g. .c, into an output file, e.g. .o.)
7. Building with the autotools
Automatic build using the autotools
• The autotools are a set of scripts (automake, autoconf, aclocal, autoheader, libtool, …)
• Lots of built-in rules for the needed transformations (.c→.o, .c→.s, .o→binary, .c→.a, .c→.so, …)
• Developers only write basic templates (configure.ac and Makefile.am)
• Based on GNU Makefiles, but with almost no need to know how to write them
• Lots of built-in features:
• Launch the build: make
• Clean the environment: make clean or make distclean
• Install the build into the stagingdir: make DESTDIR=<PATH> install
• Generate an archive of the current build: make dist
• Many other useful targets...
Autotools process overview

For each package, the developers provide (stored in the repository): configure.ac, Makefile.am, autogen.sh, and optionally an xxx.pc.in template.
The autogen.sh script runs the tools: aclocal (→ aclocal.m4), autoheader (→ config.h.in), automake (→ Makefile.in), autoconf (→ configure).
The user then only runs the configure script, which produces: config.cache, config.log, config.h, Makefile, and xxx.pc.
Automatic build using the autotools
List of the most useful targets that the GNU Coding Standards specify :
make all Build programs, libraries, documentation, etc. (same as make).
make install Install what needs to be installed, copying the files from the package's tree to system-wide directories.
make install-strip Same as make install, then strip debugging symbols. Some users like to trade space for useful bug reports...
make uninstall The opposite of make install: erase the installed files. (This needs to be run from the same build tree that was installed.)
make clean Erase from the build tree the files built by make all.
make distclean Additionally erase anything ./configure created.
make check Run the test suite, if any.
make installcheck Check the installed programs or libraries, if supported.
make dist Recreate package-version.tar.gz from all the source files.
Example 1: basic project

autogen.sh: generates the final files
configure.ac: template for the project
makefile.am: template for the Makefile
src/: contains all the source code
include/: contains all the headers

If you prefer to generate the intermediate objects in an obj/ directory (or src/), you can move the makefile.am into the chosen directory.

The configure.in file is a copy of the configure.ac (configure.in is the older name).
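As a sketch of what the two templates of such a basic project might minimally contain (the package name, source file and include path are invented for the example; real projects vary):

```
# configure.ac
AC_INIT([myproject], [1.0])
AM_INIT_AUTOMAKE([foreign])
AC_PROG_CC
AC_CONFIG_FILES([Makefile])
AC_OUTPUT

# makefile.am
bin_PROGRAMS = myproject
myproject_SOURCES = src/main.c
myproject_CPPFLAGS = -I$(top_srcdir)/include
```

Running ./autogen.sh (or autoreconf -i), then ./configure and make, would build src/main.c into the myproject binary; the "foreign" option merely tells automake not to require the strict GNU files (NEWS, README, …).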
Example 2: advanced project

autogen.sh: generates the final files
configure.ac: template for the project
makefile.am: main template for the Makefile
module1/
    makefile.am: sub-template 1 for the Makefile
    src/
    include/
module2/
    makefile.am: sub-template 2 for the Makefile
    src/
    include/
Using an autotools project at a glance
Then, using this autotools project is simple :
1. Generate the final files from the autotools templates:
$ ./autogen.sh
(makefile.am → Makefile.in, configure.ac → configure)

2. Launch the configure script with parameters:
$ ./configure --prefix=/usr --enable-debug

3. Launch the compilation:
$ make

4. Install the files:
$ make DESTDIR=/stagingdir install

5. Launch the unit tests:
$ make check

6. Generate the documentation:
$ make html

All the intermediate files have a .in extension; they are used as input by the configure script to generate the final files.
The configure.ac template
AC_PREREQ(2.59)
AC_INIT([myPackage], [myPackageVersion], [[email protected]])
AC_ARG_ENABLE(debug,
  AS_HELP_STRING([--enable-debug], [enable debugging support]),
  [enable_debug=$enableval],
  [enable_debug=no]
)
if test "$enable_debug" = "yes" ; then
  CXXFLAGS="$CXXFLAGS -Wall -ggdb -O0"
  AC_DEFINE(DEBUG, 1, [Define to enable debug build])
else
  CXXFLAGS="$CXXFLAGS -Wall -O2"
fi
PKG_CHECK_MODULES([DIRECTFB], [directfb], [have_libdirectfb=yes], [have_libdirectfb=no])
if test "$have_libdirectfb" = no ; then
  AC_MSG_ERROR([Missing directfb-1.4.1 library!!])
fi
AC_OUTPUT(Makefile)
Example of a template for a configure.ac script
Managing version
m4_define([gm_os_major_version], [1])
m4_define([gm_os_minor_version], [0])
m4_define([gm_os_micro_version], [0])
m4_define([gm_os_version], [gm_os_major_version.gm_os_minor_version.gm_os_micro_version])
AC_INIT([gm_os_posix],[gm_os_version],[[email protected]])

LT_CURRENT=0
LT_REVISION=0
LT_AGE=0

AC_SUBST(LT_CURRENT)
AC_SUBST(LT_REVISION)
AC_SUBST(LT_AGE)

GM_OS_MAJOR_VERSION=gm_os_major_version
GM_OS_MINOR_VERSION=gm_os_minor_version
GM_OS_MICRO_VERSION=gm_os_micro_version
GM_OS_VERSION=gm_os_major_version.gm_os_minor_version.gm_os_micro_version

AC_SUBST(GM_OS_MAJOR_VERSION)
AC_SUBST(GM_OS_MINOR_VERSION)
AC_SUBST(GM_OS_MICRO_VERSION)
AC_SUBST(GM_OS_VERSION)
AM_INIT_AUTOMAKE(AC_PACKAGE_NAME, AC_PACKAGE_VERSION)
As a first step, we can manage the version of the package:
Checking tools for the build
Then we can check that the tools needed for the compilation exist, and also their versions:
AC_C_CONST
AC_ISC_POSIX
AC_HEADER_STDC
AC_PROG_CC
AC_PROG_CC_STDC
AC_PROG_CXX
AC_PROG_CPP
AC_PROG_LN_S
AC_PROG_INSTALL
AC_PROG_LIBTOOL
AC_PROG_MAKE_SET
AC_PATH_PROG([PKG_CONFIG], [pkg-config])
if test -z "$PKG_CONFIG" ; then
  AC_MSG_ERROR([pkg-config not found])
fi
AC_SUBST(CXX_FOR_BUILD)
AM_CONDITIONAL(CROSS_COMPILING, test "$cross_compiling" = "yes")
Customization
The configure script has predefined rules and targets (which can be redefined); it can be customized using macros:
AC_ARG_ENABLE(test,
  AS_HELP_STRING([--enable-test], [Enable the unitary test support [default=no]]),
  [case "${enableval}" in
     yes) have_test=true ;;
     no)  have_test=false ;;
     *)   AC_MSG_ERROR(bad value ${enableval} for --enable-test) ;;
   esac],
  [have_test=false])
The list of available parameters can be displayed with:
$ ./configure --help
boolean switch
AC_ARG_WITH(optim-level,
  AS_HELP_STRING([--with-optim-level=<0,1,2,3>], [Provide the optim level to give to gcc as -O<level>]),
  [current_optim_level=$withval],
  [current_optim_level=0])
Switch with value
Usage: ./configure --enable-test
Usage: ./configure --with-optim-level=2
Integrating the debug mode
AC_ARG_ENABLE(debug,
  AS_HELP_STRING([--enable-debug], [Enable the debug support [default=no]]),
  [case "${enableval}" in
     yes) have_debug=true ;;
     no)  have_debug=false ;;
     *)   AC_MSG_ERROR(bad value ${enableval} for --enable-debug) ;;
   esac],
  [have_debug=false])
AM_CONDITIONAL(HAVE_DEBUG, $have_debug)
AC_MSG_CHECKING([Checking the debug support])
if test "$have_debug" = "true" ; then
  AC_MSG_RESULT([yes])
  AC_DEFINE(DEBUG, 1, [Define to enable debug mode])
  DEBUG_CFLAGS=" -ggdb"
  DEBUG_CPPFLAGS=" -ggdb"
  DEBUG_LDFLAGS=""
  m4_ifdef([AM_SILENT_RULES],[AM_SILENT_RULES([no])])
else
  m4_ifdef([AM_SILENT_RULES],[AM_SILENT_RULES([yes])])
  AC_MSG_RESULT([no])
fi
Usage: ./configure --enable-debug
If set, it sets all the flags needed for debug mode; it also disables the silent mode.
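To make the effect concrete, here is a small standalone shell sketch of the flag wiring (the -Wall baseline is an illustrative assumption, not a value from the slides): when debug is enabled, the DEBUG_* variables computed by configure are appended to the flags used for the build.

```shell
#!/bin/sh
# Sketch: append the debug flags computed by configure to the build flags.
have_debug=true
DEBUG_CFLAGS=""
if [ "$have_debug" = "true" ]; then
  DEBUG_CFLAGS=" -ggdb"   # full debug information for gdb
fi
CFLAGS="-Wall${DEBUG_CFLAGS}"
echo "CFLAGS=$CFLAGS"
```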
Integrating some optimization
AC_MSG_CHECKING([Optimization level])
AC_ARG_WITH(optim-level,
  AS_HELP_STRING([--with-optim-level=<0,1,2,3>], [Provide the optim level to give to gcc as -O<level>]),
  [current_optim_level=$withval],
  [current_optim_level=0])
if test "$current_optim_level" != "0" ; then
  dnl override the default optim level
  case "$current_optim_level" in
    "1" | "2" | "3" ) ;;
    *) AC_MSG_ERROR(bad value ${withval} for the optim-level parameter. It must be a number between 1 and 3.) ;;
  esac
fi
if test "$have_debug" = "true" ; then
  current_optim_level="0"
  AC_MSG_RESULT([-O$current_optim_level !! Remove all optimization in debug mode.])
else
  AC_MSG_RESULT([-O$current_optim_level])
fi
It is possible to tell gcc which optimization level to use for a build:
Usage: $ ./configure --with-optim-level=3
By default, no optimization is set; in debug mode, optimization is forced off.
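The validation logic can be exercised as plain shell. A minimal sketch with the same rules as the macro above: only levels 0 to 3 are accepted, and debug mode forces -O0.

```shell
#!/bin/sh
# Sketch of the --with-optim-level validation, outside autoconf.
withval=3
have_debug=false
current_optim_level=$withval

case "$current_optim_level" in
  0|1|2|3) ;;   # accepted levels
  *) echo "bad value $withval for the optim-level parameter" >&2; exit 1 ;;
esac

# Debug mode overrides any requested level.
if [ "$have_debug" = "true" ]; then
  current_optim_level=0
fi
echo "-O$current_optim_level"
```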
Integrating gnu/gprof
Usage: ./configure --enable-profiling
AC_ARG_ENABLE(profiling,
  AS_HELP_STRING([--enable-profiling], [Enable the builtin profiling (gnu/gprof) support [default=no]]),
  [case "${enableval}" in
     yes) have_profiling=true ;;
     no)  have_profiling=false ;;
     *)   AC_MSG_ERROR(bad value ${enableval} for --enable-profiling) ;;
   esac],
  [have_profiling=false])
dnl --------------------------------------------
dnl Test the have_profiling variable
dnl --------------------------------------------
AC_MSG_CHECKING([Checking the profiling support])
if test "$have_profiling" = "true" ; then
  AC_MSG_RESULT([yes])
  dnl Update the CFLAGS, CPPFLAGS and LDFLAGS with gprof options for gcc
  PROFILING_CFLAGS=" -pg"
  PROFILING_CPPFLAGS=" -pg"
  PROFILING_LDFLAGS=" -pg"
else
  AC_MSG_RESULT([no])
fi
If set, it adds -pg to the CFLAGS, CPPFLAGS, and LDFLAGS.
Integrating gnu/gcov

dnl --------------------------------------------
dnl Brief     : enable or disable the builtin covering mode
dnl Mandatory : no
dnl Values    : none (just enable the covering mode if set)
dnl --------------------------------------------
AC_ARG_ENABLE(covering,
  AS_HELP_STRING([--enable-covering], [Enable the builtin covering (gnu/gcov) support [default=no]]),
  [case "${enableval}" in
     yes) have_covering=true ;;
     no)  have_covering=false ;;
     *)   AC_MSG_ERROR(bad value ${enableval} for --enable-covering) ;;
   esac],
  [have_covering=false])
AC_MSG_CHECKING([Checking the covering support])
if test "$have_covering" = "true" ; then
  AC_MSG_RESULT([yes])
  dnl Update the CFLAGS and CPPFLAGS with gcov options for gcc
  COVERING_CFLAGS=" -fprofile-arcs -ftest-coverage"
  COVERING_CPPFLAGS=" -fprofile-arcs -ftest-coverage"
  COVERING_LDFLAGS=""
else
  AC_MSG_RESULT([no])
fi
Usage: ./configure --enable-covering
If set, it adds -fprofile-arcs -ftest-coverage to the CFLAGS and CPPFLAGS.
Integrating unit tests
if HAVE_TEST
TESTS = gmos_gtest
else
TESTS =
endif
noinst_PROGRAMS = $(TESTS)
gmos_gtest_SOURCES  = $(top_srcdir)/unit_test/src/test_gm_os.cpp
gmos_gtest_CPPFLAGS = -I$(HEADER_DIR) \
                      $(GTEST_CFLAGS)
gmos_gtest_LDFLAGS  = $(top_srcdir)/.libs/libgm_os.a \
                      -lpthread
gmos_gtest_LDADD    = $(GTEST_LIBS)
AC_ARG_ENABLE(test,
  AS_HELP_STRING([--enable-test], [Enable the unit test support [default=no]]),
  [case "${enableval}" in
     yes) have_test=true ;;
     no)  have_test=false ;;
     *)   AC_MSG_ERROR(bad value ${enableval} for --enable-test) ;;
   esac],
  [have_test=false])
AM_CONDITIONAL(HAVE_TEST, $have_test)
AC_MSG_CHECKING(Checking the test support)
if test "$have_test" != "false" ; then
  AC_MSG_RESULT([yes])
  PKG_CHECK_MODULES([GTEST], [gtest], [have_libgtest=yes], [have_libgtest=no])
  if test "$have_libgtest" = no ; then
    AC_MSG_ERROR([Missing libgtest library (http://code.google.com/p/googletest/) !!])
  fi
else
  AC_MSG_RESULT([no])
fi
Makefile.am
configure.ac
Usage: ./configure --enable-test
make check
Checking dependencies

For a fully automatic build process, it can be useful to get metadata from a library (version, name, cflags, ldflags, …). This is what pkg-config has brought to the OSS community:
Example: the resolution of a dependency:
# libglib2.0 dependency
PKG_CHECK_MODULES([GLIB2], [glib-2.0], [have_libglib=yes], [have_libglib=no])
dnl making this test is mandatory
if test "$have_libglib" = no ; then
  AC_MSG_ERROR([Missing libglib-2.0 library])
fi
That checks for the presence of the .pc file that contains the metadata (usually in /usr/local/lib/pkgconfig):
$ pkg-config --exists libname
PKG_CONFIG_PATH should be set to the directory that contains the .pc files.
Then the M4 macro exports both the CFLAGS and the LIBS:
$ pkg-config --cflags libname
$ pkg-config --libs libname
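For reference, a .pc file is a small text file with variables and metadata fields. A minimal gm_os.pc (the file installed by the Makefile.am template later in this deck) could look like this; the paths, version, and description are illustrative assumptions, not values from the slides:

```
prefix=/usr/local
exec_prefix=${prefix}
libdir=${exec_prefix}/lib
includedir=${prefix}/include

Name: gm_os
Description: OS abstraction library (example description)
Version: 1.0.0
Requires: glib-2.0
Cflags: -I${includedir}/gm_os
Libs: -L${libdir} -lgm_os -lpthread
```

With this file on PKG_CONFIG_PATH, pkg-config --cflags gm_os returns the include flag above plus the cflags pulled in transitively by the Requires line.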
Verbose/silent mode
Building with all the details can be useful in debug mode, but it may be unwanted for release mode.
There is a definition to state whether the silent mode should be enabled or not:
m4_ifdef([AM_SILENT_RULES], [AM_SILENT_RULES([yes/no])])

Even when silent rules are enabled, running make V=1 restores the full command lines for a single build.
Example of a Makefile.am template
Makefile.am
HEADER_DIR = $(top_srcdir)/inc
# ----------------------------------------------------------------
# Headers to be installed in the stagingdir (make install)
# gm_os is the prefix in <stagingdir_path>/usr/include/gm_os/<headers>
# ----------------------------------------------------------------
lib_includedir = $(includedir)/gm_os/
lib_include_HEADERS = $(HEADER_DIR)/gm_os_types.h \
                      $(HEADER_DIR)/gm_os_trace.h \
                      $(HEADER_DIR)/gm_os.h
# ----------------------------------------------------------------
# Used for the installation in the stagingdir
# ----------------------------------------------------------------
pkgconfigdir = $(libdir)/pkgconfig
pkgconfig_DATA = gm_os.pc
# ----------------------------------------------------------------
# Compilation and generation of the gm_os library
# ----------------------------------------------------------------
lib_LTLIBRARIES = libgm_os.la
libgm_os_la_SOURCES = src/gm_os_misc.c \
                      src/gm_os_heap.c \
                      src/gm_os_message.c
libgm_os_la_CFLAGS = -I$(HEADER_DIR) \
                     $(GLIB2_CFLAGS) \
                     $(MY_DEBUG_CFLAG)
libgm_os_la_LDFLAGS = -version-info $(LT_CURRENT):$(LT_REVISION):$(LT_AGE) \
                      -lpthread \
                      $(MY_DEBUG_LDFLAGS)
libgm_os_la_LIBADD = $(GLIB2_LIBS)
Doxygen documentation
The documentation is automatically generated in the doxygen format (HTML). For this generation, a generic doxygen configuration (default-Doxyfile) is used; it is updated by the makefile before launching the documentation build:
DOXYFILE ?= $(srcdir)/default-Doxyfile

html-local:
	@echo -n "Removing the previous documentation : "
	@rm -fR ./html
	@echo "OK"
	@echo -n "Preparing the documentation requirement : "
	@cp $(DOXYFILE) Doxyfile
	@chmod +w Doxyfile
	@echo "PROJECT_NAME=@PACKAGE_NAME@" >> Doxyfile
	@echo "PROJECT_NUMBER=@PACKAGE_VERSION@" >> Doxyfile
	@echo "INPUT=$(srcdir)/inc" >> Doxyfile
	@echo "OK"
	@echo -n "Generating the documentation in doxygen format : "
	@doxygen > /dev/null 2>&1
	@rm -f Doxyfile
	@echo "OK"
Usage: make html
That creates a new html directory that contains the documentation. The main HTML file is index.html.
Makefile.am
gnu/make/autotools vs tmake
tmake :
• Need to provide the tmake engine package
• No documentation
• Not a lot of people know how it works
• Not really supported anymore (no longer maintained)

The autotools :
• Use GNU software such as the binutils
• Need to know the template's syntax, but:
  • well documented (internet, books, …)
  • well supported by the community
  • lots of people know them
  • becoming a standard in the global OSS community
• Templates make software standalone and portable
Conclusion as a summary
What should the developer do?
What should the user do?
Write the configure.ac script needed to generate the final GNU Makefile
Write a basic Makefile.am using the autotools
Configuration : ./configure --prefix=<my_path> ...
Compilation : make
Installation : make DESTDIR=<stagingdir> install
Delivery : make dist (creates a source archive)
Clean : make clean or make distclean
Any questions?