author     Kr1ss  2019-12-05 22:09:28 +0100
committer  Kr1ss  2019-12-05 22:09:28 +0100
commit     b7fa4b3c65e39a024e472fb02bdc56d48a27035f (patch)
tree       a2e9088e16e911d61cab5680eb19bbd03610d53b
parent     2c7e18124e9575432212763926e1579c266b6a17 (diff)
download   packages-b7fa4b3c65e39a024e472fb02bdc56d48a27035f.tar.gz
           packages-b7fa4b3c65e39a024e472fb02bdc56d48a27035f.tar.bz2
           packages-b7fa4b3c65e39a024e472fb02bdc56d48a27035f.zip
adopt package & update: wapiti 3.0.2-1
upstream release
-rw-r--r--  .SRCINFO   |  15
-rw-r--r--  ChangeLog  | 486
-rw-r--r--  PKGBUILD   |  41
3 files changed, 526 insertions(+), 16 deletions(-)
diff --git a/.SRCINFO b/.SRCINFO
index 0a6b7c5..ba75a99 100644
--- a/.SRCINFO
+++ b/.SRCINFO
@@ -1,12 +1,12 @@
 pkgbase = wapiti
-	pkgdesc = A vulnerability scanner for web applications. It currently search vulnerabilities like XSS, SQL and XPath injections, file inclusions, command execution, LDAP injections, CRLF injections...
-	pkgver = 3.0.1
+	pkgdesc = A comprehensive web app vulnerability scanner written in Python
+	pkgver = 3.0.2
 	pkgrel = 1
 	url = http://wapiti.sourceforge.net/
+	changelog = ChangeLog
 	arch = any
 	license = GPL
-	depends = python
-	depends = python-setuptools
+	makedepends = python-setuptools
 	depends = python-requests
 	depends = python-beautifulsoup4
 	depends = python-lxml
@@ -14,8 +14,11 @@ pkgbase = wapiti
 	depends = python-yaswfp
 	depends = python-mako
 	depends = python-pysocks
-	source = http://downloads.sourceforge.net/sourceforge/wapiti/wapiti/wapiti-3.0.1/wapiti3-3.0.1.tar.gz
-	sha256sums = bbb8c8f572afe77319734489a6ca0b211df4b87ad294db79b8bf0bda1c5aff29
+	optdepends = python-requests-kerberos: Kerberos authentication
+	optdepends = python-requests-ntlm: NTLM authentication
+	options = zipman
+	source = http://downloads.sourceforge.net/sourceforge/wapiti/wapiti/wapiti-3.0.2/wapiti3-3.0.2.tar.gz
+	sha256sums = df86cab9f66c7794cab54fede16029056a764f5da565b2695524f9bd2bc9a384
 
 pkgname = wapiti
 
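The .SRCINFO above is machine-readable metadata that AUR helpers parse without sourcing the PKGBUILD; it is regenerated from the PKGBUILD with `makepkg --printsrcinfo > .SRCINFO`. As a minimal sketch of how its `key = value` layout can be consumed, the snippet below filters the runtime `depends` entries out of an embedded excerpt of this commit's .SRCINFO — the `srcinfo` variable and the awk filter are illustrative, not part of the package:

```shell
# Excerpt of the .SRCINFO fields from this commit (embedded for the demo).
srcinfo='pkgbase = wapiti
	pkgver = 3.0.2
	makedepends = python-setuptools
	depends = python-requests
	depends = python-beautifulsoup4
	optdepends = python-requests-kerberos: Kerberos authentication'

# Each line is "key = value"; keep only the runtime "depends" keys,
# skipping makedepends/optdepends (their key names do not match exactly).
printf '%s\n' "$srcinfo" |
  awk -F' = ' '$1 ~ /^[[:space:]]*depends$/ { print $2 }'
```

Run against the excerpt, this prints only `python-requests` and `python-beautifulsoup4`.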
diff --git a/ChangeLog b/ChangeLog
new file mode 100644
index 0000000..25d0b58
--- /dev/null
+++ b/ChangeLog
@@ -0,0 +1,486 @@
02/09/2019
    Wapiti 3.0.2
    New XXE module can send payloads in parameters, query string, file uploads and raw body.
    New module for detecting Open Redirect vulnerabilities (header based, HTML meta based or JS based).
    Fixed domain scope scanning.
    Reduced false positives in attack modules (especially time-based ones).
    Reduced invalid links generated by JS analysis and ignore obviously malformed HTML links.
    Do not crawl CSS files and remove query strings from JS files when crawling.
    Improved and changed existing payloads.
    Improved extraction of forms from HTML pages (radio buttons / select, ...)
    Support for more POST enctypes (sending XML or JSON for example, currently only leveraged by mod_xxe)
    --store-session option allows specifying a path where .db and .pkl files are stored.
    --endpoint, --internal-endpoint and --external-endpoint options to set your own endpoint and receive requests from the target.
    Authentication options can now be used with wapiti-getcookie.
    JS parser can now deal with HTML comments.
    More comprehensive choices when hitting Ctrl+C during a scan (eg: 'c' to continue, 'q' to quit)
    Fixed lots of bugs thanks to received crash dumps.

11/05/2018
    Wapiti 3.0.1
    New module mod_methods to detect interesting methods which might be allowed by scripts (PUT, PROPFIND, etc)
    New module mod_ssrf to detect Server Side Request Forgery vulnerabilities (requires Internet access)
    Improved mod_xss and mod_permanentxss modules to reduce false positives.
    Changed some XSS payloads for something more visual (banner at the top of the webpage).
    Changed bug reporting URL.
    Fixed issue #54 in lamejs JS parser.
    Removed lxml and libxml2 as dependencies. That parser has difficulties parsing exotic encodings.

03/01/2018
    Release of Wapiti 3.0.0

02/01/2018
    Added --list-modules and --resume-crawl options.

23/12/2017
    Ported to Python 3.
    Persister rewritten to use sqlite3 databases (for session management).
    Added ascii-art because you know... it's an attack tool so it's a required feature.
    Changed output format (stdout) to something more like sqlmap's output.
    python-lxml and libxml2 are required dependencies unless you opt out with --with-html5lib at setup.
    SOCKS5 proxy support is back.
    New mandatory -u option must be used to specify the base URL.
    Added -d (--depth) option to limit the maximum depth of link following.
    Added -H (--header) option to add HTTP headers to every request.
    Added -A (--user-agent) option to set the User-Agent string.
    Added --skip option to skip parameters during attacks.
    Added -S (--scan-force) option to control the amount of requests sent for attacks.
    Added --max-parameters to not attack URLs and forms having more than X input parameters.
    Added -l (--level) option to allow attacking query strings without parameters.
    Added --max-scan-time option to stop the scan after the given amount of minutes.
    Added a buster module for directory and file busting.
    Added a Shellshock detection module.
    Added a builtin list of well-known parameters to skip during attacks.
    More control on execution flow when KeyboardInterrupt is triggered.
    Reduced false-positive situations on time-based attacks (mainly blind_sql)
    Replaced getopt with argparse.
    Fixed bugs related to obtaining the user's locale (issue #20).
    Enhancement to support the new CVE notation (issue #37).
    Can now report minor issues (notices) besides anomalies and vulnerabilities.
    Added mod_delay module to report time-consuming webpages.
    Renamed some options (should be easier to remember).
    More exec, file and xss payloads.
    Fixed a bug with JSON cookie management for IPv6 addresses and custom ports.
    XSS attack module can escape HTML comments for payload generation.
    Fixed -r issue on URLs having only one parameter.
    No SSL/TLS check by default (--verify-ssl behavior).
    Added a Mutator class for easy payload injection in parameters.
    Rewrote report generators, added Mako as a dependency for HTML reports. Less JS.
    Crash reports are sent to a website; opt out with --no-bugreport.
    Improvements on backup, sql and exec modules submitted by Milan Bartos.
    Payload files can now include special flags that will be interpreted by Wapiti.
    wapiti-cookie and wapiti-getcookie were merged into a new wapiti-getcookie tool.

20/10/2013
    Version 2.3.0
    Fixed a colosseum of bugs, especially related to Unicode.
    Software is much more stable.
    New report template for HTML (using Kube CSS).
    Using v2.1.5 of the Nikto database for mod_nikto.
    Replaced httplib2 with (python-)requests for everything related to HTTP.
    Removed BeautifulSoup from the package. It is still required however.
    Core rewrite (PEP8 + more Pythonic)
    New payloads for the backup, XSS, blind SQL, exec and file modules + more detection rules.
    So many improvements on lswww (the crawler) that I can't make a list here. But Wapiti reached 48% on Wivet.
    Wapiti cookie format is now based on JSON.
    Removed SOCKS proxy support (you will have to use an HTTP-to-SOCKS proxy).
    Added an HTTPResource class for easier module creation.
    Code restructuring for a better setup.
    Attack of parameters in the query string even for HTTP POST requests.
    Attacks on file uploads (injection in file names).
    Simpler (and less buggy) colored output with -c.
    A cURL PoC is given for each vulnerability/anomaly found + raw HTTP request representation in reports.
    No more parameter reordering + can handle parameter repetition.
    Added a JSON report generator + fixed the HTML report generator.
    Added an option to not check SSL certificates.
    mod_xss: noscript tag escaping.
    Can work on parameters that don't have a value in the query string.
    mod_crlf is not activated by default anymore (must call it with -m).
    Starting URLs (-s) will be fetched even if out of scope.
    Proxy support for wapiti-getcookie and wapiti-cookie.
    Attempt to bring an OpenVAS report generator.
    Added a home-made SWF parser to extract URLs from Flash files.
    Added a home-made (and more than basic) JS interpreter based on the pynarcissus parser. Lots of work still needs to be done on this.
    New logo and webpage at wapiti.sf.net.
    Added German and Malaysian translations.
    Added a script to create a standalone archive for Windows (with py2exe).

29/12/2009
    Version 2.2.1 (already)
    Bugfixes only.
    Fixed a bug in lswww if the root URL is not given complete.
    Fixed a bug in lswww with a call to BeautifulSoup made on non-text files.
    Fixed a bug that occurred when verbosity = 2. Unicode error on stderr.
    Check the document's content-type and extension before attacking files on the query string.
    Added a timeout check in the nikto module when downloading the database.

28/12/2009
    Version 2.2.0
    Added a manpage.
    Internationalization: translations of Wapiti in Spanish and French.
    Options -k and -i allow the scan to be saved and restored later.
    Added option -b to set the scope of the scan based on the root URL given.
    Wrote a library to handle cookies and save them in XML format.
    Modules are now loaded dynamically with a dependency system.
    Rewrote the -m option used to activate / deactivate attack modules.
    New module to search for backup files of scripts on the target webserver.
    New module to search for weakly configured .htaccess.
    New module to search for dangerous files based on the Nikto database.
    Differentiate "raw" XSS from "urlencoded" XSS.
    Updated BeautifulSoup to version 3.0.8.
    Better encoding support for webpages (convert to Unicode)
    Added "resource consumption" as a vulnerability type.
    Fixed bug ID 2779441 "Python Version 2.5 required?"
    Fixed a bug with special characters in HTML reports.

05/04/2008
    Added more patterns for file handling vulnerabilities in PHP.
    Added GET_SQL and POST_SQL as modules (-m) for attacks.
    Modified getcookie.py and cookie.py so they try to get the cookies even if cookielib fails.

27/03/2007
    Updated ChangeLogs

26/03/2009
    Fixed bug ID 2433127. The comparison with HTTP error codes was made on numeric values but httplib2 returns the status code as a string.
    Forbid httplib2 to handle HTTP redirections. Wapiti and lswww will take care of this (more checks on URLs...)
    Fixed a bug with blind SQL attacks (the same attack could be launched several times)
    Fixed an error in blindSQLPayloads.txt.
    Changed the error message when Wapiti doesn't get any data from lswww.
    Verifications to be sure blind SQL attacks won't be launched if "standard" SQL attacks work.

25/03/2009
    Exported blind SQL payloads from the code. Now in the config file blindSQLPayloads.txt.
    Set the timeout for time-based BSQL attacks to the timeout used for HTTP requests + 1 second.
    Added Blind SQL as a type of vulnerability in the report generator.
    More verbosity for the permanent XSS scan.
    More docstrings.
    Updated the README.

24/03/2009
    Added some docstrings to the code.
    Removed warning on alpha code.
    First Blind SQL Injection implementation in Wapiti.
    Fixed some timeout errors.

22/03/2009
    Fixed a character encoding error in the SQL injection module.
    Changed the md5 and sha1 imports in httplib2 to hashlib.

28/11/2008
    Google Charts API is added to generate the charts of the reports.

15/11/2008
    Re-integration of standard HTTP proxies in httplib2.
    Integration of HTTP CONNECT tunneling in Wapiti.
    Fixed bug ID 2257654 "getcookie.py error missing action in html form"

02/11/2008
    Integrated the proxy implementation of httplib2 in Wapiti.
    Can now use SOCKSv5 and SOCKSv4 proxies.

22/10/2008
    Fixed a bug with Cookie headers.

19/10/2008
    Replaced urllib2 with httplib2.
    Wapiti now uses persistent HTTP connections, speeding up the scan.
    Included a Python SOCKS library.

09/10/2008
    Version 2.0.0-beta
    Added the possibility to generate reports of the vulnerabilities found in HTML, XML or plain-text format. See options -o and -f.
    HTTP authentication now works.
    Added the option -n (or --nice) to prevent endless loops during scanning.
    More patterns for SQL vulnerability detection.
    Code refactoring: clearer and more object-oriented.
    The new XSS function is now fully implemented.
    The payloads have been separated from the code into configuration files.
    Updated BeautifulSoup.

15/09/2008
    Version 1.1.7-alpha
    Use GET method if not specified in the "method" tag.
    Keep a history of XSS payloads.
    New XSS engine for the GET method using a list of payloads to bypass filters.
    New module HTTP.py for HTTP requests.
    Added fpassthru to file handling warnings.
    Added a new detection string for MS-SQL, submitted by Joe McCray.

28/01/2007
    Version 1.1.6
    New version of lswww.

24/10/2006
    Version 1.1.5
    Wildcard exclusion with the -x (--exclude) option.

22/10/2006
    Fixed a typo in wapiti.py (setAuthCreddentials: one 'd' is enough)
    Fixed a bug with set_auth_credentials.

07/10/2006
    Version 1.1.4
    Some modifications have been made on getcookie.py so it can work on Webmin (and probably more web applications)
    Added -t (--timeout) option to set the timeout in seconds.
    Added -v (--verbose) option to set the verbosity. Three available modes:
    0: only print found vulnerabilities
    1: print currently attacked URLs (existing URLs)
    2: print every attack payload and URL (a lot of information... good for debugging)
    Wapiti is much more modular and comes with some functions to set scan and attack options... look at the code ;)
    Some default options are available as "modules" with option -m (--module):
    GET_XSS: only scan for XSS with the HTTP GET method (no POST)
    POST_XSS: XSS attacks using POST and not GET
    GET_ALL: every attack without POST requests

12/08/2006
    Version 1.1.3
    Fixed the timeout bug with chunked responses (ID = 1536565 on SourceForge)

09/08/2006
    Version 1.1.2
    Fixed a bug with HTTP 500 and POST attacks.

05/08/2006
    Version 1.1.1
    Fixed the UnboundLocalError due to socket timeouts (bug ID = 1534415 on SourceForge)

27/07/2006
    Version 1.1.0 with urllib2
    Detection string for mysql_error()
    Changed the MySQL payload (see http://shiflett.org/archive/184 )
    Modification of the README file.

22/07/2006
    Added CRLF Injection.

20/07/2006
    Added LDAP Injection and Command Execution (eval, system, passthru...)

11/07/2006
    -r (--remove) option to remove parameters from URLs.
    Support for Basic HTTP Auth added, but it doesn't work with Python 2.4.
    Proxy support.
    Now uses cookie files (option "-c file" or "--cookie file")
    -u (--underline) option to highlight the vulnerable parameter in the URL.
    Detects more vulnerabilities.

04/07/2006
    Now attacks scripts using QUERY_STRING as a parameter (i.e. http://server/script?attackme)

23/06/2006
    Version 1.0.1
    Can now use cookies!! (use -c var=data or --cookie var=data)
    Two utilities added: getcookie.py (interactive) and cookie.py (command line) to get a cookie.
    Now on SourceForge.

25/04/2006
    Version 1.0.0
03/01/2018
    Release of Wapiti 3.0.0

23/12/2017
    lswww is now renamed to Crawler.
    All HTML parsing is now done with BeautifulSoup. lxml should be the parsing engine but it's possible to opt out at setup with --html5lib.
    Analysis of JS in event handlers (onblur, onclick, etc)
    Changed behavior of 'page' scope, added 'url' scope.
    Default mime type used for upload fields is image/gif.
    Added yaswf as a dependency for SWF parsing.
    Custom HTTP error codes check.
    Fixed a bug with 'button' input types.
    Updated pynarcissus with a Python 3 version for JS parsing.
    Rewrote the "in scope" check.

29/12/2009
    Version 2.3.1
    Fixed a bug in lswww if the root URL is not given complete.
    Fixed a bug in lswww with a call to BeautifulSoup made on non-text files.
    Fixed a bug that occurred when verbosity = 2. Unicode error on stderr.

27/12/2009
    Version 2.3.0
    Internationalization and translation to English and Spanish when called from Wapiti.
    Ability to save a scan session and restore it later (-i)
    Added option -b to set the scope of the scan based on the root URL given as argument.
    Fixed bug ID 2779441 "Python Version 2.5 required?"
    Use a home-made cookie library instead of urllib2's.
    Keep additional information on the webpages (headers + encoding)
    Use BeautifulSoup to detect webpage encoding and handle parsing errors.
    Fixed a bug when "a href" or "form action" have an empty string as value.
    Better support of Unicode.

26/03/2009
    Version 2.2.0
    Fixed bug ID 2433127 with HTTP 404 error codes.
    Don't let httplib2 manage HTTP redirections: return the status code and let lswww handle the new URL.

25/03/2009
    Version 2.1.9
    Added option -e (or --export)
    Saves URLs and form data to an XML file.
    We hope other fuzzers will allow importing this file.

24/03/2009
    More verifications on timeout errors.

22/03/2009
    Version 2.1.8
    Fixed bug ID: 2415094
    The check on the protocol found in hyperlinks was case-sensitive; made it case-insensitive.
    Integration of a second linkParser class called linkParser2 from lswwwv2.py. This parser uses only regexps to extract links and forms.

25/11/2008
    httplib2 uses lowercase names for the HTTP headers, as opposed to urllib2 (where the first letter was uppercase).
    Changed the verifications on headers.

15/11/2008
    Fixed a bug with links going to the parent directory.

02/11/2008
    Better integration of the proxy support provided by httplib2.
    It's now possible to use SOCKS proxies.

19/10/2008
    Version 2.1.7
    Now uses httplib2 (http://code.google.com/p/httplib2/), MIT licence, instead of urllib2.
    The ability to use persistent connections makes the scan faster.

09/10/2008
    Version 2.1.6
    HTTP authentication now works.
    Added the option -n (or --nice) to prevent endless loops during scanning.

28/01/2007
    Version 2.1.5
    First take a look at the Content-Type instead of the document extension.
    Added BeautifulSoup as an optional module to correct bad HTML documents (better use tidy if you can)

24/10/2006
    Version 2.1.4
    Wildcard exclusion with the -x (--exclude) option.

22/10/2006
    Fixed an error with URL parameter handling that appeared in the previous version.
    Fixed a typo in lswww.py (setAuthCreddentials: one 'd' is enough)

07/10/2006
    Version 2.1.3
    Three verbose modes with the -v (--verbose) option:
    0: print only results
    1: print dots for each page accessed (default mode)
    2: print each URL found during the scan
    Timeout in seconds can be set with the -t (--timeout) option.
    Fixed bug "crash when no content-type is returned".
    Fixed an error with 404 webpages.
    Fixed a bug when the only parameter of a URL is a forbidden one.

09/08/2006
    Version 2.1.2
    Fixed a bug with regular expressions.

05/08/2006
    Version 2.1.1
    Remove redundant slashes from URLs (e.g. http://server/dir//page.php converted to http://server/dir/page.php)

20/07/2006
    Version 2.1.0 with urllib2

11/07/2006
    -r (--remove) option to remove parameters from URLs.
    Generate URLs with GET forms instead of using POST by default.
    Support for Basic HTTP Auth added, but it doesn't work with Python 2.4.
    Now uses cookie files (option "-c file" or "--cookie file")
    Extracts links from Location header fields.

06/07/2006
    Extract links from "Location:" headers (HTTP 301 and 302)
    Default type for "input" elements is set to "text" (as written in the HTML 4.0 specifications)
    Added "search" to input types (created for Safari browsers)

04/07/2006
    Fixed a bug with empty parameter tuples (converts http://server/page?&a=2 to http://server/page?a=2)

23/06/2006
    Version 2.0.1
    Takes care of the "submit" type.
    No extra data sent when a page contains several forms.
    Corrected a bug with URLs ending with '?'
    Supports cookies!!

25/04/2006
    Version 2.0
    Forms are extracted as a list of tuples, each containing a string (the URL of the target script) and a dict mapping field names to their default values (or 'true' if empty).
    Lists the scripts that handle uploads.
    Can now be used as a module.

19/04/2006
    Version 1.1
    Case-insensitive tag parsing.
    Ctrl+C handling to cleanly interrupt the program.
    Extraction of URLs from form tags (action attribute).

12/10/2005
    Version 1.0
    Handling of syntactically valid links that point to nonexistent resources (404).

11/09/2005
    Beta4
    Uses the getopt module to easily specify which URLs to visit first, which URLs to exclude (new!) or which proxy to use.

24/08/2005
    Beta3
    Added a timeout for reading pages so as not to hang on a buggy script.

23/08/2005
    Version beta2
    Support for Apache-generated indexes.
    Protocol filtering.
    Handling of links going up the directory tree.
    Handling of empty links.

02/08/2005
    beta1 released
diff --git a/PKGBUILD b/PKGBUILD
index 6004eda..79b378f 100644
--- a/PKGBUILD
+++ b/PKGBUILD
@@ -1,18 +1,39 @@
-# Maintainer: mickael9 <mickael9 at gmail dot com>
+# Maintainer : Kr1ss $(echo \<kr1ss+x-yandex+com\>|sed s/\+/./g\;s/\-/@/)
+# Contributor : mickael9 <mickael9 at gmail dot com>
+
 
 pkgname=wapiti
-pkgver=3.0.1
+
+pkgver=3.0.2
 pkgrel=1
-pkgdesc="A vulnerability scanner for web applications. It currently search vulnerabilities like XSS, SQL and XPath injections, file inclusions, command execution, LDAP injections, CRLF injections..."
+
+pkgdesc='A comprehensive web app vulnerability scanner written in Python'
+arch=('any')
 url='http://wapiti.sourceforge.net/'
-license=(GPL)
-depends=(python python-setuptools python-requests python-beautifulsoup4 python-lxml python-tld python-yaswfp python-mako python-pysocks)
-arch=(any)
+license=('GPL')
+
+depends=('python-requests' 'python-beautifulsoup4' 'python-lxml' 'python-tld'
+         'python-yaswfp' 'python-mako' 'python-pysocks')
+optdepends=('python-requests-kerberos: Kerberos authentication'
+            'python-requests-ntlm: NTLM authentication')
+makedepends=('python-setuptools')
+
+options=('zipman')
 
-source=("http://downloads.sourceforge.net/sourceforge/${pkgname}/${pkgname}/${pkgname}-${pkgver}/${pkgname}${pkgver:0:1}-${pkgver}.tar.gz")
-sha256sums=('bbb8c8f572afe77319734489a6ca0b211df4b87ad294db79b8bf0bda1c5aff29')
+changelog=ChangeLog
+source=("http://downloads.sourceforge.net/sourceforge/$pkgname/$pkgname/$pkgname-$pkgver/$pkgname${pkgver:0:1}-$pkgver.tar.gz")
+sha256sums=('df86cab9f66c7794cab54fede16029056a764f5da565b2695524f9bd2bc9a384')
+
+
+build() {
+  cd "$pkgname${pkgver:0:1}-$pkgver"
+  python setup.py build
+}
 
 package() {
-  cd "${srcdir}/${pkgname}${pkgver:0:1}-${pkgver}"
-  python setup.py install --root="${pkgdir}/" --optimize=1
+  cd "$pkgname${pkgver:0:1}-$pkgver"
+  python setup.py install --root="$pkgdir" --optimize=1 --skip-build
 }
+
+
+# vim: ts=2 sw=2 et ft=PKGBUILD:
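Worth noting in the PKGBUILD is the `$pkgname${pkgver:0:1}-$pkgver` expression used in `source` and the `cd` calls: upstream names the tarball's top-level directory wapiti3-3.0.2, not wapiti-3.0.2, so the first character of `pkgver` is spliced into the name. A minimal sketch of that bash substring expansion, reusing the PKGBUILD's own variable names:

```shell
pkgname=wapiti
pkgver=3.0.2

# ${pkgver:0:1} expands to the first character of pkgver ("3"),
# rebuilding upstream's "wapiti3-3.0.2" directory name.
srcdir_name="$pkgname${pkgver:0:1}-$pkgver"
echo "$srcdir_name"   # wapiti3-3.0.2
```

The same expansion would keep working for a hypothetical 4.x release (yielding wapiti4-...), which is presumably why the maintainer computes it rather than hard-coding the directory.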