mirror of
https://gitlab.gnome.org/GNOME/libxml2.git
synced 2025-10-26 00:37:43 +03:00
Makefile.am:
* Don't use @VAR@, use $(VAR). Autoconf's AC_SUBST provides us the Make
variable, it allows overriding the value at the command line, and
(notably) it avoids a Make parse error in the libxml2_la_LDFLAGS
assignment when @MODULE_PLATFORM_LIBS@ is empty
* Changed how the THREADS_W32 mechanism switches the build between
testThreads.c and testThreadsWin32.c as appropriate; using AM_CONDITIONAL
allows this to work cleanly and plays well with dependencies
* testapi.c should be specified as BUILT_SOURCES
* Create symlinks to the test/ and result/ subdirs so that the runtests
target is usable in out-of-source-tree builds
* Don't do MAKEFLAGS+=--silent as this is not portable to non-GNU Makes
* Fixed incorrect find(1) syntax in the "cleanup" rule; also switched "rm"
  to "rm -f", which is better form
* (DIST)CLEANFILES needed a bit more coverage to allow "make distcheck" to
pass
configure.in:
* Need AC_PROG_LN_S to create test/ and result/ symlinks in Makefile.am
* AC_LIBTOOL_WIN32_DLL and AM_PROG_LIBTOOL are obsolete; these have been
  superseded by LT_INIT
* Don't rebuild docs by default, as this requires GNU Make (as
implemented)
* Check for uint32_t as some platforms don't provide it
* Check for some more functions, and undefine HAVE_MMAP if we don't also
HAVE_MUNMAP (one system I tested on actually needed this)
* Changed THREADS_W32 from a filename insert into an Automake conditional
* The "Copyright" file will not be in the current directory if builddir !=
srcdir
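
  The mmap/munmap pairing rule can be sketched with stand-in macros. This is
  an illustration, not libxml2's actual configure output: HAVE_MMAP and
  HAVE_MUNMAP mimic what AC_CHECK_FUNCS would put in config.h, and the
  commented-out define simulates a system that has mmap() but not munmap():

  ```c
  #include <stdio.h>

  #define HAVE_MMAP 1
  /* HAVE_MUNMAP deliberately left undefined, as on the system described above */

  /* The pairing rule: mmap() without munmap() is treated as no mmap() at all */
  #if defined(HAVE_MMAP) && !defined(HAVE_MUNMAP)
  #undef HAVE_MMAP
  #endif

  int mmap_usable(void) {
  #ifdef HAVE_MMAP
      return 1;
  #else
      return 0;
  #endif
  }

  int main(void) {
      printf("mmap usable: %d\n", mmap_usable());  /* prints 0 here */
      return 0;
  }
  ```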
doc/Makefile.am:
* EXTRA_DIST cannot use wildcards when they refer to generated files; this
breaks dependencies. What I did was define EXTRA_DIST_wc, which uses GNU
Make $(wildcard) directives to build up a list of files, and EXTRA_DIST,
as a literal expansion of EXTRA_DIST_wc. I also added a new rule,
"check-extra-dist", to simplify checking that the two variables are
equivalent. (Note that this works only when builddir == srcdir)
(I can implement this differently if desired; this is just one way of
doing it)
* Don't define an "all" target; this steps on Automake's toes
* Fixed up the "libxml2-api.xml ..." rule by using $(wildcard) for
dependencies (as Make doesn't process the wildcards otherwise) and
qualifying appropriate files with $(srcdir)
(Note that $(srcdir) is not needed in the dependencies, thanks to VPATH,
which we can count on as this is GNU-Make-only code anyway)
doc/devhelp/Makefile.am:
* Qualified appropriate files with $(srcdir)
* Added an "uninstall-local" rule so that "make distcheck" passes
doc/examples/Makefile.am:
* Rather than use a wildcard that doesn't work, use a substitution that
most Make programs can handle
doc/examples/index.py:
* Do the same here
include/libxml/nanoftp.h:
* Some platforms (e.g. MSVC 6) already #define INVALID_SOCKET:
user@host:/cygdrive/c/Program Files/Microsoft Visual Studio/VC98/\
Include$ grep -R INVALID_SOCKET .
./WINSOCK.H:#define INVALID_SOCKET (SOCKET)(~0)
./WINSOCK2.H:#define INVALID_SOCKET (SOCKET)(~0)
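
  A minimal sketch of the guard such a header needs; the first #define
  simulates the platform header (winsock.h) having defined the macro already,
  and -1 stands in for winsock's (SOCKET)(~0) to keep the sketch int-typed:

  ```c
  #include <stdio.h>

  /* Simulated platform definition, as on MSVC 6: */
  #define INVALID_SOCKET (-1)

  /* The guarded definition a portable header should use; it is skipped
   * whenever the platform got there first, avoiding a redefinition: */
  #ifndef INVALID_SOCKET
  #define INVALID_SOCKET (-1)
  #endif

  int is_invalid(int s) { return s == INVALID_SOCKET; }

  int main(void) {
      printf("%d %d\n", is_invalid(-1), is_invalid(3));  /* prints "1 0" */
      return 0;
  }
  ```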
include/libxml/xmlversion.h.in:
* Support ancient GCCs (I was actually able to build the library with GCC
  2.5, except for this bit)
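
  A hedged sketch of the kind of version guard this implies: only emit
  __attribute__ annotations on GCCs new enough to parse them. The 2.7 cutoff
  and the ATTRIBUTE_UNUSED name here are illustrative; the real header's
  condition may differ:

  ```c
  #include <stdio.h>

  #if defined(__GNUC__) && (__GNUC__ > 2 || (__GNUC__ == 2 && __GNUC_MINOR__ >= 7))
  #define ATTRIBUTE_UNUSED __attribute__((unused))
  #else
  #define ATTRIBUTE_UNUSED /* ancient or non-GNU compiler: expand to nothing */
  #endif

  /* The macro suppresses the unused-parameter warning where supported and
   * disappears entirely where it is not: */
  static int helper(int used, int unused_arg ATTRIBUTE_UNUSED) {
      return used * 2;
  }

  int main(void) {
      printf("%d\n", helper(21, 0));  /* prints 42 */
      return 0;
  }
  ```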
python/Makefile.am:
* Expanded CLEANFILES to allow "make distcheck" to pass
python/tests/Makefile.am:
* Define CLEANFILES instead of a "clean" rule, and added tmp.xml to allow
"make distcheck" to pass
testRelax.c:
* Use HAVE_MMAP instead of the less explicit HAVE_SYS_MMAN_H (as some
systems have the header but not the function)
testSchemas.c:
* Use HAVE_MMAP instead of the less explicit HAVE_SYS_MMAN_H
testapi.c:
* Don't use putenv() if it's not available
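
  The guard pattern can be sketched as below; HAVE_PUTENV stands in for the
  config.h result of an AC_CHECK_FUNCS test, and the variable name is
  hypothetical, not one testapi.c actually sets:

  ```c
  #include <stdio.h>
  #include <stdlib.h>

  #define HAVE_PUTENV 1   /* what configure would define on most systems */

  static int set_test_var(void) {
  #ifdef HAVE_PUTENV
      /* putenv() keeps the pointer it is given, so pass a static string */
      return putenv("LIBXML_EXAMPLE_VAR=1") == 0;
  #else
      return 0;   /* quietly skip the env-dependent test instead of failing */
  #endif
  }

  int main(void) {
      printf("putenv worked: %d\n", set_test_var());
      return 0;
  }
  ```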
threads.c:
* This fixes the following build error on Solaris 8:
libtool: compile: cc -DHAVE_CONFIG_H -I. -I./include -I./include \
-D_REENTRANT -D__EXTENSIONS__ -D_REENTRANT -Dsparc -Xa -mt -v \
-xarch=v9 -xcrossfile -xO5 -c threads.c -KPIC -DPIC -o threads.o
"threads.c", line 442: controlling expressions must have scalar type
"threads.c", line 512: controlling expressions must have scalar type
cc: acomp failed for threads.c
*** Error code 1
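
  For readers who haven't met this Sun cc diagnostic: it fires when the
  expression controlling an if/while is not of scalar type. A minimal
  illustration of the pattern and a portable rewrite (hypothetical code, not
  the actual threads.c change):

  ```c
  #include <stdio.h>

  /* A hypothetical opaque handle; some platforms type their thread handles
   * as structs, which is when code like "if (a == b)" stops compiling. */
  typedef struct { unsigned long id; } thread_handle;

  int handles_equal(thread_handle a, thread_handle b) {
      /* "if (a == b)" would be a non-scalar controlling expression (and is
       * invalid C in general); compare a scalar member instead. */
      return a.id == b.id;
  }

  int main(void) {
      thread_handle x = {7}, y = {7}, z = {9};
      printf("%d %d\n", handles_equal(x, y), handles_equal(x, z));  /* "1 0" */
      return 0;
  }
  ```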
trio.c:
* Define isascii() if the system doesn't provide it
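
  The fallback can be sketched as a guarded macro, so a system-provided
  definition always wins; the exact expression trio.c uses may differ:

  ```c
  /* Only define isascii() when the platform hasn't already: */
  #ifndef isascii
  #define isascii(c) ((unsigned)(c) <= 0x7f)
  #endif

  #include <stdio.h>

  int main(void) {
      /* 'A' (65) is ASCII; 0x80 is the first non-ASCII code point */
      printf("%d %d\n", isascii('A') != 0, isascii(0x80) != 0);  /* "1 0" */
      return 0;
  }
  ```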
trio.h:
* The trio library's HAVE_CONFIG_H header is not the same as LibXML2's
HAVE_CONFIG_H header; this change is needed to avoid a double-inclusion
win32/configure.js:
* Added support for the LZMA compression option
win32/Makefile.{bcb,mingw,msvc}:
* Added appropriate bits to support WITH_LZMA=1
* Install the header files under $(INCPREFIX)\libxml2\libxml instead of
$(INCPREFIX)\libxml, to mirror the install location on Unix+Autotools
xml2-config.in:
* @MODULE_PLATFORM_LIBS@ (usually "-ldl") needs to be in there in order for
`xml2-config --libs` to provide a complete set of dependencies
xmllint.c:
* Use HAVE_MMAP instead of the less-explicit HAVE_SYS_MMAN_H
299 lines · 8.7 KiB · Python · Executable File
#!/usr/bin/python -u
#
# Indexes the examples and build an XML description
#
import string
import glob
import sys
try:
    import libxml2
except:
    sys.exit(1)
sys.path.insert(0, "..")
from apibuild import CParser, escape

examples = []
extras = ['examples.xsl', 'index.py']
tests = []
sections = {}
symbols = {}
api_dict = None
api_doc = None

def load_api():
    global api_dict
    global api_doc

    if api_dict != None:
        return
    api_dict = {}
    try:
        print "loading ../libxml2-api.xml"
        api_doc = libxml2.parseFile("../libxml2-api.xml")
    except:
        print "failed to parse ../libxml2-api.xml"
        sys.exit(1)

def find_symbol(name):
    global api_dict
    global api_doc

    if api_doc == None:
        load_api()

    if name == None:
        return
    if api_dict.has_key(name):
        return api_dict[name]
    ctxt = api_doc.xpathNewContext()
    res = ctxt.xpathEval("/api/symbols/*[@name = '%s']" % (name))
    if type(res) == type([]) and len(res) >= 1:
        if len(res) > 1:
            print "Found %d references to %s in the API" % (len(res), name)
        node = res[0]
        typ = node.name
        file = node.xpathEval("string(@file)")
        info = node.xpathEval("string(info)")
    else:
        print "Reference %s not found in the API" % (name)
        return None
    ret = (typ, file, info)
    api_dict[name] = ret
    return ret

def parse_top_comment(filename, comment):
    res = {}
    lines = string.split(comment, "\n")
    item = None
    for line in lines:
        while line != "" and (line[0] == ' ' or line[0] == '\t'):
            line = line[1:]
        while line != "" and line[0] == '*':
            line = line[1:]
        while line != "" and (line[0] == ' ' or line[0] == '\t'):
            line = line[1:]
        try:
            (it, line) = string.split(line, ":", 1)
            item = it
            while line != "" and (line[0] == ' ' or line[0] == '\t'):
                line = line[1:]
            if res.has_key(item):
                res[item] = res[item] + " " + line
            else:
                res[item] = line
        except:
            if item != None:
                if res.has_key(item):
                    res[item] = res[item] + " " + line
                else:
                    res[item] = line
    return res

def parse(filename, output):
    global symbols
    global sections

    parser = CParser(filename)
    parser.collect_references()
    idx = parser.parse()
    info = parse_top_comment(filename, parser.top_comment)
    output.write("  <example filename='%s'>\n" % filename)
    try:
        synopsis = info['synopsis']
        output.write("    <synopsis>%s</synopsis>\n" % escape(synopsis));
    except:
        print "Example %s lacks a synopsis description" % (filename)
    try:
        purpose = info['purpose']
        output.write("    <purpose>%s</purpose>\n" % escape(purpose));
    except:
        print "Example %s lacks a purpose description" % (filename)
    try:
        usage = info['usage']
        output.write("    <usage>%s</usage>\n" % escape(usage));
    except:
        print "Example %s lacks an usage description" % (filename)
    try:
        test = info['test']
        output.write("    <test>%s</test>\n" % escape(test));
        progname = filename[0:-2]
        command = string.replace(test, progname, './' + progname, 1)
        tests.append(command)
    except:
        pass
    try:
        author = info['author']
        output.write("    <author>%s</author>\n" % escape(author));
    except:
        print "Example %s lacks an author description" % (filename)
    try:
        copy = info['copy']
        output.write("    <copy>%s</copy>\n" % escape(copy));
    except:
        print "Example %s lacks a copyright description" % (filename)
    try:
        section = info['section']
        output.write("    <section>%s</section>\n" % escape(section));
        if sections.has_key(section):
            sections[section].append(filename)
        else:
            sections[section] = [filename]
    except:
        print "Example %s lacks a section description" % (filename)
    for topic in info.keys():
        if topic != "purpose" and topic != "usage" and \
           topic != "author" and topic != "copy" and \
           topic != "section" and topic != "synopsis" and topic != "test":
            str = info[topic]
            output.write("    <extra topic='%s'>%s</extra>\n" % (
                         escape(topic), escape(str)))
    output.write("    <includes>\n")
    for include in idx.includes.keys():
        if include.find("libxml") != -1:
            output.write("      <include>%s</include>\n" % (escape(include)))
    output.write("    </includes>\n")
    output.write("    <uses>\n")
    for ref in idx.references.keys():
        id = idx.references[ref]
        name = id.get_name()
        line = id.get_lineno()
        if symbols.has_key(name):
            sinfo = symbols[name]
            refs = sinfo[0]
            # gather at most 5 references per symbols
            if refs > 5:
                continue
            sinfo.append(filename)
            sinfo[0] = refs + 1
        else:
            symbols[name] = [1, filename]
        info = find_symbol(name)
        if info != None:
            type = info[0]
            file = info[1]
            output.write("      <%s line='%d' file='%s' name='%s'/>\n" % (type,
                         line, file, name))
        else:
            type = id.get_type()
            output.write("      <%s line='%d' name='%s'/>\n" % (type,
                         line, name))

    output.write("    </uses>\n")
    output.write("  </example>\n")

    return idx

def dump_symbols(output):
    global symbols

    output.write("  <symbols>\n")
    keys = symbols.keys()
    keys.sort()
    for symbol in keys:
        output.write("    <symbol name='%s'>\n" % (symbol))
        info = symbols[symbol]
        i = 1
        while i < len(info):
            output.write("      <ref filename='%s'/>\n" % (info[i]))
            i = i + 1
        output.write("    </symbol>\n")
    output.write("  </symbols>\n")

def dump_sections(output):
    global sections

    output.write("  <sections>\n")
    keys = sections.keys()
    keys.sort()
    for section in keys:
        output.write("    <section name='%s'>\n" % (section))
        info = sections[section]
        i = 0
        while i < len(info):
            output.write("      <example filename='%s'/>\n" % (info[i]))
            i = i + 1
        output.write("    </section>\n")
    output.write("  </sections>\n")

def dump_Makefile():
    for file in glob.glob('*.xml'):
        extras.append(file)
    for file in glob.glob('*.res'):
        extras.append(file)
    Makefile = """# Beware this is autogenerated by index.py
INCLUDES = -I$(top_builddir)/include -I$(top_srcdir)/include -I@srcdir@/include @THREAD_CFLAGS@ @Z_CFLAGS@
DEPS = $(top_builddir)/libxml2.la
LDADDS = @STATIC_BINARIES@ $(top_builddir)/libxml2.la @THREAD_LIBS@ @Z_LIBS@ $(ICONV_LIBS) -lm @WIN32_EXTRA_LIBADD@

rebuild: examples.xml index.html

examples.xml: index.py $(noinst_PROGRAMS:=.c)
	-@($(srcdir)/index.py)

index.html: examples.xml examples.xsl
	-@(xsltproc examples.xsl examples.xml && echo "Rebuilt web page" && xmllint --valid --noout index.html)

install-data-local:
	$(mkinstalldirs) $(DESTDIR)$(HTML_DIR)
	-@INSTALL@ -m 0644 $(srcdir)/*.html $(srcdir)/*.c $(srcdir)/*.xml $(srcdir)/*.xsl $(srcdir)/*.res $(DESTDIR)$(HTML_DIR)

"""
    EXTRA_DIST = ""
    for extra in extras:
        EXTRA_DIST = EXTRA_DIST + extra + " "
    Makefile = Makefile + "EXTRA_DIST=%s\n\n" % (EXTRA_DIST)
    noinst_PROGRAMS = ""
    for example in examples:
        noinst_PROGRAMS = noinst_PROGRAMS + example + " "
    Makefile = Makefile + "noinst_PROGRAMS=%s\n\n" % (noinst_PROGRAMS)
    for example in examples:
        Makefile = Makefile + "%s_SOURCES=%s.c\n%s_LDFLAGS=\n%s_DEPENDENCIES= $(DEPS)\n%s_LDADD= @RDL_LIBS@ $(LDADDS)\n\n" % (example, example, example,
                              example, example)
    Makefile = Makefile + "valgrind: \n\t$(MAKE) CHECKER='valgrind' tests\n\n"
    Makefile = Makefile + "tests: $(noinst_PROGRAMS)\n"
    Makefile = Makefile + "\t@(echo '## examples regression tests')\n"
    Makefile = Makefile + "\t@(echo > .memdump)\n"
    for test in tests:
        Makefile = Makefile + "\t@($(CHECKER) %s)\n" % (test)
    Makefile = Makefile + '\t@(grep "MORY ALLO" .memdump | grep -v "MEMORY ALLOCATED : 0" ; exit 0)\n'
    Makefile = Makefile + "\n\n"
    try:
        old = open("Makefile.am", "r").read()
        if old != Makefile:
            n = open("Makefile.am", "w").write(Makefile)
            print "Updated Makefile.am"
    except:
        print "Failed to read or save Makefile.am"
    #
    # Autogenerate the .cvsignore too ...
    #
    ignore = """.memdump
Makefile.in
Makefile
"""
    for example in examples:
        ignore = ignore + "%s\n" % (example)
    try:
        old = open(".cvsignore", "r").read()
        if old != ignore:
            n = open(".cvsignore", "w").write(ignore)
            print "Updated .cvsignore"
    except:
        print "Failed to read or save .cvsignore"

if __name__ == "__main__":
    load_api()
    output = open("examples.xml", "w")
    output.write("<examples>\n")

    for file in glob.glob('*.c'):
        parse(file, output)
        examples.append(file[:-2])

    dump_symbols(output)
    dump_sections(output)
    output.write("</examples>\n")
    output.close()
    dump_Makefile()