shell

Content Tagged Shell

Netcat, la navaja suiza de TCP/IP (Netcat, the Swiss Army knife of TCP/IP) | CRySoL
Wednesday, April 23, 2008
Netcat: del.icio.us/tag/netcat linux Unix network shell net nc redes

Do not close stderr
Tuesday, April 22, 2008
A few years ago, I wrote a post commenting on how ugly this was:

$ someprog 2>/dev/null

I was nearly imploring the reader to close stderr instead:

$ someprog 2>&-

A very knowledgeable anonymous commenter explained why that was a bad idea. At the time, I didn't understand exactly what they were saying, so I deleted the post. Yesterday, for no particular reason, the implications of closing stderr popped into my head, in the shower no less. I wrote a simple little C program named do-not-close-stderr.c. It takes two parameters: a string you want written to a file, and the file you want that string written to.
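The C source is not reproduced in the post, so here is a rough shell stand-in for its interface (a hypothetical sketch, not the original program). The interesting property of the real program is that it open()s the file, so if descriptor 2 has been closed the file is handed fd 2 and anything later written to stderr lands in it; explicit shell redirections pick their own descriptor numbers and cannot reproduce that detail, which is why the demonstration uses C.

#!/usr/bin/env bash
# Hypothetical stand-in for do-not-close-stderr.c (interface only).
# Usage: do-not-close-stderr STRING FILE
string=$1
file=$2
exec 3>"$file"                              # "open" the output file
echo "Some kind of warning message." >&2    # warning goes to stderr
printf '%s\n' "$string" >&3                 # the string goes to the file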
After opening the file, the C program prints "some kind of warning message" to stderr. Here we are:

$ gcc -Wall do-not-close-stderr.c -o do-not-close-stderr
$ ./do-not-close-stderr "Brock was here." output
Some kind of warning message.
$ cat output
Brock was here.

Now let's close standard error when executing:

$ ./do-not-close-stderr "Brock was here." output 2>&-
$ cat output
Some kind of warning message.
Brock was here.

Thanks to whoever that commenter was.
Unix: BASH Cures Cancer Blog Unix shell todo practices Examples not good

Dean Edwards: MiniWeb
Monday, April 21, 2008
MiniWeb models an entire web site in a single HTML page. All of the site files are stored in a JSON object, which you can navigate with a UNIX-like shell or the system browser. It has a built…
json: del.icio.us/tag/json Web WiKI JavaScript shell JSON Review webdev

JSSh - a TCP/IP JavaScript Shell Server for Mozilla
Monday, April 21, 2008
Firefox: del.icio.us/tag/firefox Development testing JavaScript Firefox mozilla shell

A brief look at manipulating text in Linux
Monday, April 21, 2008
Kellan-Elliot-Mcrea: del.icio.us/kellan Data Unix Text shell cat split Kellan-Elliot-Mcrea

JSSh - a TCP/IP JavaScript Shell Server for Mozilla
Monday, April 21, 2008
JSSh is a Mozilla C++ extension module that allows other programs (such as telnet) to establish JavaScript shell connections to a running Mozilla process via TCP/IP. This functionality is useful for interactive debugging and development of Mozilla applications.
XUL: del.icio.us/tag/XUL Development Programming testing JavaScript Firefox mozilla shell

Julius Plenz - Little Reverse Shell Guide
Monday, April 21, 2008
Netcat: del.icio.us/tag/netcat Security shell pentest Tutorial HOWTO Guide Hacking

Netcat, la navaja suiza de TCP/IP (Netcat, the Swiss Army knife of TCP/IP) | CRySoL
Monday, April 21, 2008
Netcat: del.icio.us/tag/netcat linux Unix network shell nc redes tutoriales

prepend to a file with sponge from moreutils
Thursday, April 17, 2008
A few weeks ago I wrote about a tool which helps you easily prepend to a file. I submitted prepend to moreutils, and Joey was kind enough to point out that this could be done with `sponge'. sponge reads standard input and, when done, writes it to a file:

Probably the most general purpose tool in moreutils so far is sponge(1), which lets you do things like this:

% sed "s/root/toor/" /etc/passwd | grep -v joey | sponge /etc/passwd

Two days ago Joey released version 0.29 of moreutils, including a patch by yours truly (with much help from Joey):

sponge: Handle large data sizes by using a temp file rather than consuming arbitrary amounts of memory. Patch by Brock Noland. (version 0.29 changelog)

Also, on a non-command-line note, I found a video on Joey's site which I thought was pretty cool: Joey Learns to Fly.
Unix: BASH Cures Cancer Blog Unix shell tools patches contributions moreutils prepend
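As a hedged aside (not spelled out in the post itself), the actual prepend with sponge can look like this, with placeholder file names. Because sponge soaks up all of its input before it reopens and rewrites the file, the read and the write do not race each other:

$ echo "My name is:" | cat - somefile | sponge somefile

cat reads standard input first (the "-") and then the existing file, so the new content ends up in front.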
igor.moochnick - Pash
Monday, April 14, 2008
opensource: del.icio.us/tag/opensource linux osx shell opensource pash

igor.moochnick - Pash (open source PowerShell for Windows and "others")
Monday, April 14, 2008
An open source PowerShell reimplementation for "others" (Mac, Linux, Solaris, etc.) and for Windows (including Windows Mobile and Windows CE). About the name: Pash = Posh (PowerShell) + bash (one of the Unix shells).
open-source: del.icio.us/tag/open-source Software tool shell bash open-source it admin

using kill to see if a process is alive
Wednesday, April 09, 2008
I am making some changes to the moreutils sponge command. sponge provides a method of prepending which is less specialized than my prepend util, but it has trouble with large amounts of input. Regardless, while testing my changes I want to watch it operate. Normally you would just do so from a second terminal, but that is a pain. kill -0 can be very useful for this. After backgrounding the command, I assign the pid (via the variable $!) to $pid using eval; eval is needed to stop bash from expanding $! until after the background operation. After that, I enter a while loop on kill -0 $pid, which will not kill $pid, but will return successfully until $pid has died:

# cat large-file-GB | ./sponge large-file-GB-copy & eval 'pid=$!'; while kill -0 $pid; do sleep 10; ls -lh large-file* /tmp/sponge.*; echo; done
[1] 7937
-rw-r--r-- 1 root root 977M 2008-04-09 16:18 large-file-GB
-rw------- 1 root root 128M 2008-04-09 17:23 /tmp/sponge.JMsBWG

-rw-r--r-- 1 root root 977M 2008-04-09 16:18 large-file-GB
-rw------- 1 root root 384M 2008-04-09 17:23 /tmp/sponge.JMsBWG

-rw-r--r-- 1 root root 977M 2008-04-09 16:18 large-file-GB
-rw------- 1 root root 877M 2008-04-09 17:24 /tmp/sponge.JMsBWG

-rw-r--r-- 1 root root 977M 2008-04-09 16:18 large-file-GB
-rw-r--r-- 1 root root  20M 2008-04-09 17:24 large-file-GB-copy
-rw------- 1 root root 896M 2008-04-09 17:24 /tmp/sponge.JMsBWG

-rw-r--r-- 1 root root 977M 2008-04-09 16:18 large-file-GB
-rw-r--r-- 1 root root 413M 2008-04-09 17:25 large-file-GB-copy
-rw------- 1 root root 896M 2008-04-09 17:24 /tmp/sponge.JMsBWG

-rw-r--r-- 1 root root 977M 2008-04-09 16:18 large-file-GB
-rw-r--r-- 1 root root 836M 2008-04-09 17:25 large-file-GB-copy
-rw------- 1 root root 896M 2008-04-09 17:24 /tmp/sponge.JMsBWG

-rw-r--r-- 1 root root 977M 2008-04-09 16:18 large-file-GB
-rw-r--r-- 1 root root 920M 2008-04-09 17:25 large-file-GB-copy
[1]+ Done                    cat large-file-GB | ./sponge large-file-GB-copy
ls: cannot access /tmp/sponge.*: No such file or directory
-rw-r--r-- 1 root root 977M 2008-04-09 16:18 large-file-GB
-rw-r--r-- 1 root root 977M 2008-04-09 17:25 large-file-GB-copy
-bash: kill: (7937) - No such process
# md5sum large-file-GB*
b5c667a723a10a3485a33263c4c2b978  large-file-GB
b5c667a723a10a3485a33263c4c2b978  large-file-GB-copy

Unix: BASH Cures Cancer Blog Unix script shell tools links PS -ef
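A reusable version of that polling loop, as a hedged sketch (the helper name and argument order are mine, not from the post): kill -0 sends no signal at all, it only reports via its exit status whether the process still exists, so it is safe to call repeatedly.

# Hypothetical helper: run a status command every INTERVAL seconds until PID exits.
watch-pid() {
    local pid=$1 interval=$2
    shift 2
    while kill -0 "$pid" 2>/dev/null; do
        "$@"                # status command, e.g. ls -lh on the files of interest
        echo
        sleep "$interval"
    done
}

# Usage, mirroring the example above:
cat large-file-GB | ./sponge large-file-GB-copy &
watch-pid "$!" 10 ls -lh large-file-GB large-file-GB-copy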
Performance testing - with curl
Monday, April 07, 2008
Often I need or want to do some type of performance testing. Given my ideas on software development, I can usually do this by making simple HTTP requests, and I use curl for that. You may be tempted to do this in a for loop (or worse, actually write something!):

$ time for i in {1..1000}; do curl -s "http://bashcurescancer.com/blank.html"; done

real    0m23.436s
user    0m6.416s
sys     0m7.351s

curl provides the same functionality on its own:

$ time curl -s "http://bashcurescancer.com/blank.html?[1-1000]"

real    0m6.561s
user    0m0.294s
sys     0m0.494s

Here are the details from the curl manual:

The URL syntax is protocol dependent. You'll find a detailed description in RFC 3986. You can specify multiple URLs or parts of URLs by writing part sets within braces, as in:

http://site.{one,two,three}.com

or you can get sequences of alphanumeric series by using [ ] as in:

ftp://ftp.numericals.com/file[1-100].txt
ftp://ftp.numericals.com/file[001-100].txt    (with leading zeros)
ftp://ftp.letters.com/file[a-z].txt

No nesting of the sequences is supported at the moment, but you can use several ones next to each other:

http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html

You can specify any amount of URLs on the command line. They will be fetched in a sequential manner in the specified order. Since curl 7.15.1 you can also specify a step counter for the ranges, so that you can get every Nth number or letter:

http://www.numericals.com/file[1-100:10].txt
http://www.letters.com/file[a-z:2].txt

If you specify URL without protocol:// prefix, curl will attempt to guess what protocol you might want. It will then default to HTTP but try other protocols based on often-used host name prefixes. For example, for host names starting with "ftp." curl will assume you want to speak FTP. curl will attempt to re-use connections for multiple file transfers, so that getting many files from the same server will not do multiple connects / handshakes. This improves speed. Of course this is only done on files specified on a single command line and cannot be used between separate curl invokes.

This is important as it helps measure the actual change being tested. A for loop, by creating a new process on every iteration, will fill up your test with "local" time. Using a single curl process eliminates this, which should allow you to see the results of your test in a more transparent manner. For example, let's say you have a change that reduces page production time. You're not sure by how much, so you decide to run 1000 tests. Eliminating a second from a 23-second test is less than 5 percent, while removing a second from a 6-second test is almost 20 percent.
Unix: BASH Cures Cancer Blog Development Software Unix testing shell HTTP practice

New command: prepend
Sunday, April 06, 2008
I am utilizing Google's project hosting to host software which I create and feel is useful or want to keep track of. I called the project Brock's Tools. The code that led me to create this project was a command I am calling prepend 1.1. (UPDATE: See this post on sponge, as it's a better general-case tool.) prepend prepends files or standard input to a file. For example, say you have three files:

$ echo BROCK > a
$ echo DAVID > b
$ echo NOLAND > c

And you want to combine them into one file:

$ echo "My name is:" | prepend - a b c
$ cat c
My name is:
BROCK
DAVID
NOLAND

Or let's say you just want to append a file to itself:

$ cat a
BROCK
$ cat a >> a
cat: a: input file is output file

prepend does this:

$ prepend a
$ cat a
BROCK
BROCK

I come across a situation where this would be useful quite often. Of course, prepending can be done in the shell:

$ { echo "My name is:"; cat a b c; } > tmp && mv -f tmp c
$ cat c
My name is:
BROCK
DAVID
NOLAND

However, that is unsafe, and I have lost data that way. I perform this operation most often when dealing with XML. In this example it's trivial to open the file in an editor, but with a large file it's quite nasty to do so:

$ cat something.xml
stuff 1
stuff 2
stuff 3
stuff 4
$ echo "" >> something.xml
$ cat something.xml
stuff 1
stuff 2
stuff 3
stuff 4
$ echo "" | prepend - something.xml
$ cat something.xml
stuff 1
stuff 2
stuff 3
stuff 4

Unix: BASH Cures Cancer Blog Unix python script shell tools brock's
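As a hedged aside (not from the original post), the plain-shell idiom above can be made somewhat safer by creating the temporary file with mktemp next to the target and only replacing the target when the whole pipeline succeeded; it is still not atomic against concurrent writers. The function name and file names are placeholders.

# Hypothetical safer variant of the { echo ...; cat ...; } > tmp && mv idiom.
prepend-line() {
    local line=$1 target=$2 tmp
    tmp=$(mktemp "${target}.XXXXXX") || return 1      # temp file next to the target
    { printf '%s\n' "$line"; cat -- "$target"; } > "$tmp" &&
        mv -f -- "$tmp" "$target" ||
        { rm -f -- "$tmp"; return 1; }                # keep the original on failure
}

# Usage, with the file from the example above:
prepend-line "My name is:" c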
Shell Function - Which Webserver Does That Site Run?
Friday, April 04, 2008
I just read the post Python - Script - Which Webserver Does That Site Run? by blogger Corey Goldberg. I prefer the shell version:

$ what-http-server() { curl -s -I "http://$1" | awk -F': ' '/^Server:/ {print $2}'; }
$ what-http-server www.pylot.org
Apache/2.0.52

$ what-http-server() { curl -s -I "$@" | awk -F': ' '/^Server:/ {print $2}'; }
$ what-http-server www.pylot.org google.com bashcurescancer.com
Apache/2.0.52
gws
Apache/2.2.6 (Unix)

That works, but this version is more correct:

what-http-server() { curl -s -I $(for h in "$@"; do printf "http://%s " "$h"; done) | awk -F': ' '/^Server:/ {print $2}'; }

In the version that works for multiple hosts, we are letting curl assume the protocol is HTTP. That works fine most of the time, but there are exceptions:

If you specify URL without protocol:// prefix, curl will attempt to guess what protocol you might want. It will then default to HTTP but try other protocols based on often-used host name prefixes. For example, for host names starting with "ftp." curl will assume you want to speak FTP. - man curl

Unix: BASH Cures Cancer Blog Unix python shell HTTP wget function curl

Exposing command line programs as web services
Thursday, March 27, 2008
The web services paradigm of development is based on the Unix philosophy of "small is good". Web services should do one job, and do it well, allowing users to develop complex solutions by combining small, reliable and proven services. Why not, then, expose the power of familiar Unix commands like sort, grep, gzip… to the web? Here is a proof-of-concept python script (Python 2.3 version) to demonstrate.

Start services:

$ ./to_web.py -p8008 sort &
Thu Mar 27 13:45:54 2008 sort server started - 8008
$ ./to_web.py -p8009 gzip &
Thu Mar 27 13:46:29 2008 gzip server started - 8009

Use the services:

$ for i in {1..10}; do echo ${RANDOM:0:2}; done | \
> curl --data-binary @- "http://swat:8008/sort+-nr" | \
> curl --data-binary @- "http://swat:8009/gzip" | \
> gunzip
97
37
23
23
21
18
11
11
10
10

In my position, we have a database with host information which has a command line interface. This tool has dependencies which are painful to resolve. With to_web.py, we can turn the command line tool into a web service and access the data without having to satisfy those additional dependencies.

This is a guest post by my esteemed colleague Adam Fokken. He can be reached here; sadly, he does not have a blog.
Unix: BASH Cures Cancer Blog Unix python script shell tools WebService Ideas
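As a small client-side convenience (a hedged sketch, not part of the original post), the repeated curl calls can be wrapped in a function; the host name swat, the ports, and the plus-separated command syntax in the URL path are simply taken from the example above.

# Hypothetical wrapper: pipe stdin through a command exposed by to_web.py
# at http://HOST:PORT/COMMAND.
web-cmd() {
    local host_port=$1 command=$2
    curl -s --data-binary @- "http://${host_port}/${command}"
}

# The same pipeline as above, using the wrapper:
for i in {1..10}; do echo ${RANDOM:0:2}; done \
    | web-cmd swat:8008 "sort+-nr" \
    | web-cmd swat:8009 gzip \
    | gunzip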