From: UnixOS2 Archive
To: "UnixOS2 Archive"
Date: Tue, 16 Jul 2002 04:32:24 EST
Subject: [UnixOS2_Archive] No. 274

**************************************************
Monday 15 July 2002 Number 274
**************************************************

Subjects for today

1 Re: novice orientation : Maynard
2 Re: cc : lordspigol
3 Re: cc : lordspigol
4 Re: Perl 5.8.0 RC3 : Maynard
5 Re: novice orientation : Thomas Dickey
6 novice orientation : lordspigol
7 Re: novice orientation : Maynard
8 Re: XCOPY using WGET : Maynard
9 Re: cc : John Poltorak
10 Re: XCOPY using WGET : lordspigol
11 Re: novice orientation : lordspigol
12 Re: Baseline toolset RC1 : lordspigol
13 Re: XCOPY using WGET : Maynard
14 XCOPY using WGET : John Poltorak
15 Re: XCOPY using WGET : illya at vaeshiep.demon.nl
16 WGET & CURL : lordspigol
17 Re: XCOPY using WGET : lordspigol
18 Re: XCOPY using WGET : illya at vaeshiep.demon.nl
19 Re: XCOPY using WGET : Dave Saville
20 UnixOS2 feel : lordspigol
21 Re: XCOPY using WGET : csaba.raduly at sophos.com
22 Re: novice orientation : csaba.raduly at sophos.com
23 Re: XCOPY using WGET : lordspigol
24 Re: XCOPY using WGET : DoC
25 Re: XCOPY using WGET : lordspigol
26 Re: Ogg Vorbis goes gold : Brian Havard
27 Re: Baseline toolset RC1 : John Poltorak
28 Re: XCOPY using WGET : lordspigol
29 Re: Baseline toolset RC1 : lordspigol
30 Re: Baseline toolset RC1 : Sebastian Wittmeier (ShadoW)
31 Re: XCOPY using WGET : John Poltorak
32 Re: XCOPY using WGET : John Poltorak
33 Re: Baseline toolset RC1 : Sebastian Wittmeier (ShadoW)
34 Re: novice orientation : John Poltorak
35 Re: Perl 5.8.0 RC3 : John Poltorak
36 Re: Perl 5.8.0 RC3 : Sebastian Wittmeier (ShadoW)
37 Re: XCOPY using WGET : Sebastian Wittmeier (ShadoW)
38 Re: Perl 5.8.0 RC3 : Sebastian Wittmeier (ShadoW)
39 Re: XCOPY using WGET : Sebastian Wittmeier (ShadoW)
40 Re: cc : Stefan Neis
41 Re: WGET & CURL : csaba.raduly at sophos.com
42 Re: XCOPY using WGET : Steve Wendt
43 Re: Baseline toolset RC1 : Kris Steenhaut
44 Re: Baseline toolset RC1 : Stefan Neis
45 Re: Baseline toolset RC1 : Andreas Buening
46 Re: XCOPY using WGET : Illya Vaes
47 Re: WGET & CURL : Sebastian Wittmeier (ShadoW)

**= Email 1 ==========================**
Date: Tue, 16 Jul 2002 01:55:26 -0500 (CDT)
From: "Maynard"
Subject: Re: novice orientation

John,

>> For debugging I prefer "one step at a time", so I thought I'd sit at
>> the shell prompt and issue instructions as found in the .sh file; which
>> would be a lot less awkward than modifying the .sh file between runs
>> and re-executing the .cmd file.

>I would guess that running the shell script for building Perl, one step at
>a time would take weeks...

No; the interactive shell command sequence would be the same as in build_perl.sh: Configure, make, make test, make install, perl harness

--
Maynard

**= Email 2 ==========================**
Date: Tue, 16 Jul 2002 02:46:52 -0300 (ADT)
From: "lordspigol"
Subject: Re: cc

Hmm, I understand. Backward compatibility again.

One more question, though: does the DLL know which executable called it? Can I run any fgrep command through egrep, or something like that?

[]s
Rod

On Mon, 15 Jul 2002 19:24:08 +0100, csaba.raduly at sophos.com wrote:
>
>Hyst^H^H^Historical reasons. In the beginning there was grep. Then there
>was fgrep. Then there was egrep. These were separate programs.
>
>The GNU folks' implementation of [ef]?grep does it with a single
>executable. They just tweak the regexp package behaviour and then execute
>the same code. In order to accommodate people/scripts who still expect any
>one of the mentioned executables, they provide all three.
>
>It is probably a peculiarity of the OS/2 port to provide three executables
>and a DLL to do the real job.
>(on a Linux system, these three are ALL symlinks to /bin/egrep)

**= Email 3 ==========================**
Date: Tue, 16 Jul 2002 02:56:04 -0300 (ADT)
From: "lordspigol"
Subject: Re: cc

Unfortunately, it is impossible on HPFS.
:(

There is a filesystem on Hobbes that supports linking, but I don't feel it is safe. Does anyone have an idea?

4OS2 allows a link to be faked with the alias list loaded by 4start.cmd. With 4OS2 as the default shell, many programs understand the alias and behave correctly. I guess that would break completely if tried with the UnixOS2 project, though. Still, the Unix ability to create a link from a plain text file is awesome. Well, you can't win them all. :(

[]s
Rod

On Mon, 15 Jul 2002 19:27:48 +0100, John Poltorak wrote:
>On Unix egrep and fgrep are symbolically linked to grep, but we don't have
>the luxury of doing this on OS/2.

**= Email 4 ==========================**
Date: Tue, 16 Jul 2002 03:02:53 -0500 (CDT)
From: "Maynard"
Subject: Re: Perl 5.8.0 RC3

Rod,

>I guess the problem of lack of a little readme saying what packages
>& versions of these packages are needed, it will happen again.

as it happens, the current issue of os/2 ezine is relevant
http://www.os2ezine.com/20020716/page_7.html

`~Maynard

**= Email 5 ==========================**
Date: Tue, 16 Jul 2002 06:27:04 -0400
From: Thomas Dickey
Subject: Re: novice orientation

On Tue, Jul 16, 2002 at 07:13:59AM -0300, lordspigol wrote:
> sh and bash are the same thing?

no - generally bash implements everything in sh compatibly enough to run scripts designed for Bourne shell.

--
Thomas E. Dickey
http://invisible-island.net
ftp://invisible-island.net

**= Email 6 ==========================**
Date: Tue, 16 Jul 2002 07:13:59 -0300 (ADT)
From: "lordspigol"
Subject: novice orientation

sh and bash are the same thing?

**= Email 7 ==========================**
Date: Tue, 16 Jul 2002 08:26:48 -0500 (CDT)
From: "Maynard"
Subject: Re: novice orientation

John,

>So, what is stopping you?
>
>You only need to ensure you have the correct environment set up then start
>sh and run Configure. What's the problem?
The problem is ensuring the correct environment ;-}

My starting point includes no development tools (other than your baseline and the os2 linker), the emx runtime in path and libpath, and some gnu tools in path which seem distinctly different from those in your baseline.

What I'm slowly learning is that I need to create an environment which probably excludes emx_runtime, adds baseline to path and libpath, and probably sets some environment variables like LIBRARY_PATH and who knows what else.

From my situation there is much more to learn than to forget ;-}

Thanks for your patience,
--
Maynard

**= Email 8 ==========================**
Date: Tue, 16 Jul 2002 08:58:22 -0500 (CDT)
From: "Maynard"
Subject: Re: XCOPY using WGET

John,

>I've actually managed to achieve what I wanted using:-
>
>wget -Ncr -nH --cut-dirs=3 -P TARGET URL

From my recent close examination of wget, what you have is as good as it gets.

-nH removes the hostname of the source from the target.
--cut-dirs removes the specified number of directories in the source path.

These two are only relevant when using recursion via -r. I've only tested these with ftp; http could require something additional. See also -nr, --dont-remove-listing (don't remove `.listing' files).

-N provides the timestamp checking. I used this against your baseline directory to verify that I had everything which you did. It looks like a good mirroring system. All of my .cmd files for the baseline had this intent for checking updates.

The current wget does a nice job of maintaining a connection for multiple files in a list; a feature I hadn't used before.

Curl is favorably mentioned, but so far wget is not deficient for my uses, so I haven't looked into curl.

Later,
--
Maynard

**= Email 9 ==========================**
Date: Tue, 16 Jul 2002 09:37:31 +0100
From: John Poltorak
Subject: Re: cc

On Tue, Jul 16, 2002 at 02:56:04AM -0300, lordspigol wrote:
> Unfortunately, it is impossible on HPFS.
> :(

AIUI, there is some work being done to allow symbolic links to work on HPFS and JFS...

> []s
> Rod
>
> On Mon, 15 Jul 2002 19:27:48 +0100, John Poltorak wrote:
>
> >On Unix egrep and fgrep are symbolically linked to grep, but we don't have
> >the luxury of doing this on OS/2.

--
John

**= Email 10 ==========================**
Date: Tue, 16 Jul 2002 10:20:32 -0300 (ADT)
From: "lordspigol"
Subject: Re: XCOPY using WGET

Why would FAT be needed? HPFS is the default in OS/2 and a marvel. :)

Rod

On Tue, 16 Jul 2002 12:54:58 GMT, illya at vaeshiep.demon.nl wrote:
>>Check out the --no-host-directories option which loses the "unix.os2site.com"
>>directory (i.e. the one corresponding to the host name).
>>>Needless to say, it won't work on FAT...
>>Unless the remote files and directories all conform to 8.3,
>>like many files on Hobbes do, for example.
>
>Assuming one knows about --no-host-directories (I didn't, offhand) and uses it too.
>The directory name "hobbes.nmsu.edu" doesn't conform to 8.3 and verrry few websites have
>only one dot in them
>;-\

**= Email 11 ==========================**
Date: Tue, 16 Jul 2002 10:51:59 -0300 (ADT)
From: "lordspigol"
Subject: Re: novice orientation

Same here. My modest experience with Pascal means nothing with UnixOS2. :(

[]s
Rod

On Tue, 16 Jul 2002 08:26:48 -0500 (CDT), Maynard wrote:
>From my situation there is much more to learn than to forget ;-}

**= Email 12 ==========================**
Date: Tue, 16 Jul 2002 10:54:53 -0300 (ADT)
From: "lordspigol"
Subject: Re: Baseline toolset RC1

MED is the right editor for UnixOS2. My luck is that I love this editor. :)

Rod

On Tue, 16 Jul 2002 14:12:54 +0100, John Poltorak wrote:
>This is a REXX test which ought to work, but I don't understand what it is
>trying to do, although the error msg looks like a line termination problem.
>ie. if you edit a REXX program and save it with Unix line termination you
>get similar errors when trying to run it.
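The line-termination pitfall John describes in the quote above can be demonstrated from a shell. A minimal editorial sketch, not from the thread itself; the file name and contents are made up, and it assumes a POSIX shell with grep and awk available:

```shell
#!/bin/sh
# OS/2's REXX interpreter expects CR/LF line endings, so a script saved
# by a Unix editor (LF only) fails with confusing errors. This checks a
# file for carriage returns and converts it if they are missing.
set -e
f=$(mktemp)
printf '/* rexx */\nsay "hello"\n' > "$f"   # LF-only, as a Unix editor would save it

cr=$(printf '\r')
if grep -q "$cr" "$f"; then
  echo "already CR/LF"
else
  echo "LF-only, converting"
  # Append a CR before each newline (a portable stand-in for unix2dos).
  awk '{ printf "%s\r\n", $0 }' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
fi

grep -c "$cr" "$f"   # prints 2: both lines now end in CR/LF
```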
**= Email 13 ==========================**
Date: Tue, 16 Jul 2002 10:57:37 -0500 (CDT)
From: "Maynard"
Subject: Re: XCOPY using WGET

On Tue, 16 Jul 2002 12:01:17 -0300 (ADT), lordspigol wrote:
>Where in the world is TAR/2?!

[x:\unixos2\baseline]grep tar toolset.lst
http://hobbes.nmsu.edu/pub/os2/util/archiver/gtar258.zip

`~Maynard

**= Email 14 ==========================**
Date: Tue, 16 Jul 2002 11:13:32 +0100
From: John Poltorak
Subject: XCOPY using WGET

I'm trying to get WGET to function as XCOPY between TCP/IP hosts, but can't figure out the correct flags. Can anyone help me out?

I want to do something like:-

xcopy host1:path1 host2:path2 /s

--
John

**= Email 15 ==========================**
Date: Tue, 16 Jul 2002 11:16:05 GMT
From: illya at vaeshiep.demon.nl
Subject: Re: XCOPY using WGET

>>I'm trying to get WGET to function as XCOPY between TCP/IP hosts, but
>>can't figure out the correct flags. Can anyone help me out?
>>I want to do something like:-
>>xcopy host1:path1 host2:path2 /s
>To the best of my knowledge it is possible only with
>an ftp client that is able to do server-to-server transfers.

wget -r
or
wget --recursive

but it puts the copied stuff below the current directory and has the drawback of putting it in a directory structure consisting of "host1\path1".

Example: wget -r http://unix.os2site.com/some/dir/deep/down/
will put the retrieved stuff in/below unix.os2site.com\some\dir\deep\down .

Needless to say, it won't work on FAT...

**= Email 16 ==========================**
Date: Tue, 16 Jul 2002 11:32:13 -0300 (ADT)
From: "lordspigol"
Subject: WGET & CURL

I took a look at CURL some days ago. It goes well beyond wget. Wget is a downloader; CURL seems to be a sort of hacking tool that can also be a downloader.

I also tested Brian Havard's software. All in all, the only problem is that I think there may not be any program that can download the same way a browser does. Some sites need the browser to do the download.
Is there some way to trick a site into thinking a browser is doing the download when it is really a command-line tool?

[]s
Rod

On Tue, 16 Jul 2002 08:58:22 -0500 (CDT), Maynard wrote:
>Curl is favorably mentioned, but so far wget is not deficient for my
>uses, so I haven't looked into curl.

**= Email 17 ==========================**
Date: Tue, 16 Jul 2002 12:01:17 -0300 (ADT)
From: "lordspigol"
Subject: Re: XCOPY using WGET

Where in the world is TAR/2?!

[]s
Rod

On Tue, 16 Jul 2002 14:53:02 +0100, John Poltorak wrote:
>I was looking for something like this many years ago, and fortunately
>saved it as one of my prize snippets:-
>
>tar cppf - d:/os2/. | tar xfpp -
>
>I haven't used it in a while though...

**= Email 18 ==========================**
Date: Tue, 16 Jul 2002 12:54:58 GMT
From: illya at vaeshiep.demon.nl
Subject: Re: XCOPY using WGET

>Check out the --no-host-directories option which loses the "unix.os2site.com"
>directory (i.e. the one corresponding to the host name).
>>Needless to say, it won't work on FAT...
>Unless the remote files and directories all conform to 8.3,
>like many files on Hobbes do, for example.

Assuming one knows about --no-host-directories (I didn't, offhand) and uses it too. The directory name "hobbes.nmsu.edu" doesn't conform to 8.3 and verrry few websites have only one dot in them ;-\

**= Email 19 ==========================**
Date: Tue, 16 Jul 2002 12:57:23 +0100 (BST)
From: "Dave Saville"
Subject: Re: XCOPY using WGET

On Tue, 16 Jul 2002 11:13:32 +0100, John Poltorak wrote:
>
>I'm trying to get WGET to function as XCOPY between TCP/IP hosts, but
>can't figure out the correct flags. Can anyone help me out?
>
>I want to do something like:-
>
>xcopy host1:path1 host2:path2 /s

John,

If you can mount the drives in question, I always use tar piped to tar.
OS/2 tar has a change-directory flag, so it's something like:

cd to where you want to start

tar -cvf - | tar -D somewhereelse -xf -

From memory that -D is not the correct switch, but you get the general idea.

on *nix one would do

tar -cvf - | (cd somewhereelse; tar -xf -)

but OS/2 does not understand that syntax for command lines - even with the correct command separator, which I can never remember :-)

--
Regards

Dave Saville
Please note new email address dave.saville at ntlworld.com

**= Email 20 ==========================**
Date: Tue, 16 Jul 2002 12:58:33 -0300 (ADT)
From: "lordspigol"
Subject: UnixOS2 feel

Folks, because of the high technical level of the members of this list, I am no longer just a temporary member. =)

Even while learning what I need (compiling some security software, especially anti-sniffing tools), I have learned to love the special environment that we have on this list.

cheers to all & take care,
Rod

**= Email 21 ==========================**
Date: Tue, 16 Jul 2002 13:01:58 +0100
From: csaba.raduly at sophos.com
Subject: Re: XCOPY using WGET

On 16/07/2002 12:16:05 owner-os2-unix wrote:
>>>I'm trying to get WGET to function as XCOPY between TCP/IP hosts, but
>>>can't figure out the correct flags. Can anyone help me out?
>>>I want to do something like:-
>>>xcopy host1:path1 host2:path2 /s
>>To the best of my knowledge it is possible only with
>>an ftp client that is able to do server-to-server transfers.
>
>wget -r
>or
>wget --recursive
>but it puts the copied stuff below the current directory

Newer versions have the -P/--directory-prefix=PREFIX option.

>and has the drawback of putting
>it in a directory structure consisting of "host1\path1".
>Example: wget -r http://unix.os2site.com/some/dir/deep/down/
>will put the retrieved stuff in/below unix.os2site.com\some\dir\deep\down .

Check out the --no-host-directories option which loses the "unix.os2site.com" directory (i.e. the one corresponding to the host name).
There's also --cut-dirs=NUMBER, which discards a number of "levels" from the beginning (--cut-dirs=2 would save the downloaded files to deep\down).

You might need the --no-parent switch to prevent it from ascending above the required directory (usually not a problem with FTP, but if you go through an HTTP proxy, you'll get an HTML page from it which is likely to have a link to the parent).

>Needless to say, it won't work on
>FAT...

Unless the remote files and directories all conform to 8.3, like many files on Hobbes do, for example.

--
Csaba Ráduly, Software Engineer Sophos Anti-Virus
email: csaba.raduly at sophos.com
http://www.sophos.com
US Support: +1 888 SOPHOS 9 UK Support: +44 1235 559933

**= Email 22 ==========================**
Date: Tue, 16 Jul 2002 13:04:32 +0100
From: csaba.raduly at sophos.com
Subject: Re: novice orientation

On 16/07/2002 11:27:04 owner-os2-unix wrote:
>On Tue, Jul 16, 2002 at 07:13:59AM -0300, lordspigol wrote:
>> sh and bash are the same thing?
>
>no - generally bash implements everything in sh compatibly enough to run
>scripts designed for Bourne shell.

OS/2 ports of both bash and pdksh supply a sh.exe too. In some cases this is a "cut-down" version, i.e. with only the features of the original Bourne shell.

--
Csaba Ráduly, Software Engineer Sophos Anti-Virus
email: csaba.raduly at sophos.com
http://www.sophos.com
US Support: +1 888 SOPHOS 9 UK Support: +44 1235 559933

**= Email 23 ==========================**
Date: Tue, 16 Jul 2002 13:12:08 -0300 (ADT)
From: "lordspigol"
Subject: Re: XCOPY using WGET

Perfect! What I was missing was TAR.EXE. I am rebuilding my environment; I had a little problem with LVM in the OS/2 install. Compression can be good for that sometimes, too. GZIP was one of the first things I installed from Hobbes. :) But TAR was a little more troublesome.

God, Hobbes has an incredibly rich set of utils!

[]s
Rod

On Tue, 16 Jul 2002 17:14:15 +0200 (CEST), Sebastian Wittmeier (ShadoW) wrote:
>gtar258.zip from hobbes?
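As a summary of the directory options traded back and forth in this thread, the following sketch computes where `wget -r` would place a retrieved file: by default, with -nH, and with -nH --cut-dirs=2. This is an editorial illustration using plain POSIX string handling (wget does the equivalent internally); the URL is the example already used above.

```shell
#!/bin/sh
# Mimic wget's local-directory mapping with parameter expansion.
url="http://unix.os2site.com/some/dir/deep/down/file.txt"

rest="${url#*://}"        # unix.os2site.com/some/dir/deep/down/file.txt
host="${rest%%/*}"        # unix.os2site.com
path="${rest#*/}"         # some/dir/deep/down/file.txt

# Default: host directory plus the full remote path.
echo "default:   $host/$path"

# -nH / --no-host-directories: drop the host directory.
echo "-nH:       $path"

# --cut-dirs=2 additionally removes the first two path components.
cut=$path
cut="${cut#*/}"
cut="${cut#*/}"
echo "cut-dirs:  $cut"    # deep/down/file.txt
```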
**= Email 24 ==========================**
Date: Tue, 16 Jul 2002 13:26:59 -0300 (EST)
From: "DoC"
Subject: Re: XCOPY using WGET

On Tue, 16 Jul 2002 11:16:05 GMT, illya at vaeshiep.demon.nl wrote:
>wget -r
>or
>wget --recursive
>but it puts the copied stuff below the current directory and has the drawback of putting
>it in a directory structure consisting of "host1\path1".
>Example: wget -r http://unix.os2site.com/some/dir/deep/down/
>will put the retrieved stuff in/below unix.os2site.com\some\dir\deep\down .
>Needless to say, it won't work on
>FAT...

I'd also use "-np" so it doesn't climb up the directory tree and download the whole host...

Actually you can disable the creation of the host directory (www.xxxx.com/) with the option "-nH". There are other options that might also help, like "--cut-dirs=X", where X is the number of directories from the root that won't be created. For example:

wget -r -nH --cut-dirs=2 http://www.blah.com/bleh/blih/index.html

would save the files to the current directory instead of creating the dir structure "www.blah.com\bleh\blih".

--
DoC

**= Email 25 ==========================**
Date: Tue, 16 Jul 2002 13:27:35 -0300 (ADT)
From: "lordspigol"
Subject: Re: XCOPY using WGET

Skies! TAR was already here and I didn't know!

D:\unixos2\213.152.37.92\pub\unixos2\baseline\archives\gtar258.zip

What a domain name can hide... :)

Rod

On Tue, 16 Jul 2002 10:57:37 -0500 (CDT), Maynard wrote:
>>Where in the world is TAR/2?!
>
>[x:\unixos2\baseline]grep tar toolset.lst
>http://hobbes.nmsu.edu/pub/os2/util/archiver/gtar258.zip

**= Email 26 ==========================**
Date: Tue, 16 Jul 2002 13:45:21 +1000 (EST)
From: "Brian Havard"
Subject: Re: Ogg Vorbis goes gold

On Mon, 15 Jul 2002 17:32:18 +0100, John Poltorak wrote:
>I saw a note on TheREG which said Ogg Vorbis had gone gold.
>
>Wondered if the OS/2 port would be available soon...

The Register's article was premature; it's not quite official yet.
I've got an OS/2 port ready to go once it is, though.

--
 ______________________________________________________________________________
| Brian Havard                   | "He is not the messiah!                     |
| brianh at kheldar.apana.org.au  | He's a very naughty boy!" - Life of Brian   |
 ------------------------------------------------------------------------------

**= Email 27 ==========================**
Date: Tue, 16 Jul 2002 14:12:54 +0100
From: John Poltorak
Subject: Re: Baseline toolset RC1

On Tue, Jul 16, 2002 at 03:02:12PM +0200, Sebastian Wittmeier (ShadoW) wrote:
> After putting rxu.dll and vxrexx.dll into the libpath the results got
> worse:

Does it actually pick them up from LIBPATH? Are they also on the PATH? Last time I looked at the tests, they were flawed, since they tried to pick up LIBPATH from the environment.

> lib/rx_cmprt.t 255 65280 18 3 16.67% 16-18

This is a REXX test which ought to work, but I don't understand what it is trying to do, although the error msg looks like a line termination problem. ie. if you edit a REXX program and save it with Unix line termination you get similar errors when trying to run it.

> (but it skips less than your build)

I think I must have messed up my build somehow. I'll try again from scratch.

> Sebastian

--
John

**= Email 28 ==========================**
Date: Tue, 16 Jul 2002 14:14:21 -0300 (ADT)
From: "lordspigol"
Subject: Re: XCOPY using WGET

Powerful is the magic of wget. =) Awesome util.

Rod

On Tue, 16 Jul 2002 13:26:59 -0300 (EST), DoC wrote:
>wget -r -nH --cut-dirs=2 http://www.blah.com/bleh/blih/index.html
>
>would save the files to the current directory instead of creating the dir structure "www.blah.com\bleh\blih".

**= Email 29 ==========================**
Date: Tue, 16 Jul 2002 14:25:14 -0300 (ADT)
From: "lordspigol"
Subject: Re: Baseline toolset RC1

Thanks, Kris! =)

Take a look at F File Manager: http://filemanager.free.fr/index.htm

It is a cake of power. But success is not guaranteed.
:( Even copying a file is trouble, but the file viewer is Unix vi compatible, and it has Laplink-mode file transfer and an ftpd (yes!!). In short: a cake of power. It is on my list of software to learn.

cheers,
Rod

On Tue, 16 Jul 2002 18:15:28 +0200, Kris Steenhaut wrote:
>> MED is the right editor for UnixOS2. My luck is that I love this editor. :)
>
>Seconded! I followed your advice about 6 months ago, and I'm glad I did. :-)

**= Email 30 ==========================**
Date: Tue, 16 Jul 2002 14:31:40 +0200 (CEST)
From: "Sebastian Wittmeier (ShadoW)"
Subject: Re: Baseline toolset RC1

On Tue, 16 Jul 2002 12:17:36 +0100, John Poltorak wrote:
>Since RC3 is the latest could you give that a try sometime? You need to
>make a slight change to makedef.pl for it to work.

Failed 21/726 test scripts, 97.11% okay. 100/67968 subtests failed, 99.85% okay.

Failed Test                       Stat Wstat Total Fail  Failed  List of Failed
-------------------------------------------------------------------------------
../ext/DB_File/t/db-btree.t          0    15    ??   ??       %  ??
../ext/DB_File/t/db-hash.t           0    15    ??   ??       %  ??
../ext/DB_File/t/db-recno.t          0    15    ??   ??       %  ??
../ext/IO/lib/IO/t/io_multihomed.  255 65280     8    8 100.00%  1-8
../ext/IO/lib/IO/t/io_sock.t       255 65280    20   20 100.00%  1-20
../ext/IO/lib/IO/t/io_udp.t        255 65280     7    7 100.00%  1-7
../ext/Socket/Socket.t              29  7424    16   15  93.75%  2-16
../lib/AnyDBM_File.t                 0    15    12   12 100.00%  1-12
../lib/ExtUtils/t/basic.t            1   256    17    1   5.88%  14
../lib/Memoize/t/errors.t            0    15    11    8  72.73%  4-11
../lib/Memoize/t/tie.t               0    15    ??   ??       %  ??
../lib/Net/Ping/t/110_icmp_inst.t  255 65280     2    1  50.00%  2
../lib/Net/Ping/t/120_udp_inst.t   255 65280     2    1  50.00%  2
../lib/Net/Ping/t/130_tcp_inst.t   255 65280     2    1  50.00%  2
../lib/Net/Ping/t/140_stream_inst  255 65280     2    1  50.00%  2
../lib/Net/hostent.t                 6  1536     7    6  85.71%  2-7
../lib/Net/t/hostname.t                          2    1  50.00%  1
lib/os2_ea.t                                    21    8  38.10%  7-11 14-16
lib/os2_process.t                    6  1536   227    6   2.64%  80 85 90 94 174 209
lib/rx_cmprt.t                     255 65280    18    3  16.67%  16-18
run/fresh_perl.t                                97    1   1.03%  91
66 tests and 556 subtests skipped.

>> If only ramfs wouldn't trap with so many files ...
>
>Do you have the latest RAMFS. That's what I used and didn't have any
>traps.

Heh, you are already cheating! When I tried it again, RAMFS didn't trap, but make produced an error (couldn't fork anymore).

Sebastian

**= Email 31 ==========================**
Date: Tue, 16 Jul 2002 14:53:02 +0100
From: John Poltorak
Subject: Re: XCOPY using WGET

On Tue, Jul 16, 2002 at 12:57:23PM +0100, Dave Saville wrote:
> On Tue, 16 Jul 2002 11:13:32 +0100, John Poltorak wrote:
>
> >I'm trying to get WGET to function as XCOPY between TCP/IP hosts, but
> >can't figure out the correct flags. Can anyone help me out?
> >
> >I want to do something like:-
> >
> >xcopy host1:path1 host2:path2 /s
>
> John
>
> If you can mount the drives in question I always use tar piped to
> tar.

I can't use NFS.

> OS/2 tar has a change directory flag so its something like
>
> cd to where you want to start
>
> tar -cvf - | tar -D somewhereelse -xf -
>
> From memory that -D is not the correct switch but you get the general
> idea.
>
> on *nix one would do
>
> tar -cvf - | (cd somewhereelse; tar -xf -)
>
> but OS/2 does not understand that syntax for command lines - even
> with the correct command separator which I can never remember :-)

I was looking for something like this many years ago, and fortunately saved it as one of my prize snippets:-

tar cppf - d:/os2/. | tar xfpp -

I haven't used it in a while though...
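The tar-pipe idiom in John's snippet can be tried safely on scratch directories. A minimal sketch using GNU tar, whose change-directory switch is -C (the switch Dave could not remember); the directories and file here are temporary and made up for the demonstration:

```shell
#!/bin/sh
# Tar-pipe copy: the first tar writes an archive of $src to stdout,
# the second reads it from stdin and unpacks into $dst.
# -p preserves permissions on both sides; -C changes directory first.
set -e
src=$(mktemp -d)
dst=$(mktemp -d)
mkdir -p "$src/os2/sub"
echo "hello" > "$src/os2/sub/file.txt"

tar -C "$src" -cpf - . | tar -C "$dst" -xpf -

cat "$dst/os2/sub/file.txt"   # prints: hello
```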
> --
> Regards
>
> Dave Saville
> Please note new email address dave.saville at ntlworld.com

--
John

**= Email 32 ==========================**
Date: Tue, 16 Jul 2002 15:01:14 +0100
From: John Poltorak
Subject: Re: XCOPY using WGET

On Tue, Jul 16, 2002 at 01:01:58PM +0100, csaba.raduly at sophos.com wrote:
> You might need the --no-parent switch to prevent it from
> ascending above the required directory (usually not a problem
> with FTP, but if you go through an HTTP proxy, you'll get an
> HTML page from it which is likely to have a link to the parent).

ISTR there being something required when going through a proxy, but I can't remember precisely what.

This doesn't work properly through a proxy:-

wget -Ncr -nH --cut-dirs=3 -P %bld_home% ftp://unixos2:@213.152.37.92/pub/unixos2/build_system/

Is there anything I can add to make it work? I do have the proxy settings working correctly in WGETRC.

--
John

**= Email 33 ==========================**
Date: Tue, 16 Jul 2002 15:02:12 +0200 (CEST)
From: "Sebastian Wittmeier (ShadoW)"
Subject: Re: Baseline toolset RC1

After putting rxu.dll and vxrexx.dll into the libpath the results got worse:

Failed 23/726 test scripts, 96.83% okay. 235/68195 subtests failed, 99.66% okay.

Failed Test                       Stat Wstat Total Fail  Failed  List of Failed
-------------------------------------------------------------------------------
../ext/DB_File/t/db-btree.t          0    15    ??   ??       %  ??
../ext/DB_File/t/db-hash.t           0    15    ??   ??       %  ??
../ext/DB_File/t/db-recno.t          0    15    ??   ??       %  ??
../ext/IO/lib/IO/t/io_multihomed.  255 65280     8    8 100.00%  1-8
../ext/IO/lib/IO/t/io_sock.t       255 65280    20   20 100.00%  1-20
../ext/IO/lib/IO/t/io_udp.t        255 65280     7    7 100.00%  1-7
../ext/Socket/Socket.t              29  7424    16   15  93.75%  2-16
../lib/AnyDBM_File.t                 0    15    12   12 100.00%  1-12
../lib/ExtUtils/t/Embed.t                        9    9 100.00%  1-9
../lib/ExtUtils/t/basic.t            1   256    17    1   5.88%  14
../lib/Memoize/t/errors.t            0    15    11    8  72.73%  4-11
../lib/Memoize/t/tie.t               0    15    ??   ??       %  ??
../lib/Net/Ping/t/110_icmp_inst.t  255 65280     2    1  50.00%  2
../lib/Net/Ping/t/120_udp_inst.t   255 65280     2    1  50.00%  2
../lib/Net/Ping/t/130_tcp_inst.t   255 65280     2    1  50.00%  2
../lib/Net/Ping/t/140_stream_inst  255 65280     2    1  50.00%  2
../lib/Net/hostent.t                 6  1536     7    6  85.71%  2-7
../lib/Net/t/hostname.t                          2    1  50.00%  1
lib/os2_ea.t                                    21    8  38.10%  7-11 14-16
lib/os2_process.t                  255 65280   227  126  55.51%  80 85 90 94 106-227
lib/os2_process_kid.t                          227    6   2.64%  80 85 90 94 174 209
lib/rx_cmprt.t                     255 65280    18    3  16.67%  16-18
run/fresh_perl.t                                97    1   1.03%  91
65 tests and 544 subtests skipped.

(but it skips less than your build)

Sebastian

**= Email 34 ==========================**
Date: Tue, 16 Jul 2002 15:22:53 +0100
From: John Poltorak
Subject: Re: novice orientation

On Tue, Jul 16, 2002 at 08:26:48AM -0500, Maynard wrote:
> John,
>
> >So, what is stopping you?
> >
> >You only need to ensure you have the correct environment set up then start
> >sh and run Configure. What's the problem?
>
> The problem is ensuring the correct environment ;-}

This is the whole point of putting together a baseline build. It is to ensure a uniform build environment. I would guess that no two OS/2 users have put their own Unix-like structure together in the same way, leading to very unpredictable results when other users try following any given set of instructions.

> My starting point includes no development tools (other than your
> baseline and the os2 linker), emx runtime in path and libpath, some gnu
> tools in path which seem distinctly different from those in your
> baseline.

Explicitly setting PATH to the tools in the baseline build should eliminate everything else. The major problem is LIBPATH, which I haven't managed to sort out yet.

> What I'm slowly learning is that I need to create an environment which
> probably excludes emx_runtime, adds baseline to path and libpath, and
> probably sets some environment variables like LIBRARY_PATH and who knows
> what else.
Everything you need should be set up for you as part of the installation scripts. There's always a chance that something has been overlooked, but in such an event, let me know and I'll try and get it included.

> --
> Maynard

--
John

**= Email 35 ==========================**
Date: Tue, 16 Jul 2002 15:45:57 +0100
From: John Poltorak
Subject: Re: Perl 5.8.0 RC3

On Tue, Jul 16, 2002 at 04:28:26PM +0200, Sebastian Wittmeier (ShadoW) wrote:
> The following code snippet makes many tests fail:
>
> eval { require AnyDBM_File }; # not all places have dbm* functions
> if ($@) {
>     print "ok\n";
>     exit 0;
> }
>
> Require exits if dbm is not supported, but it should print "ok".
> I'm no Perl guru. So, who can fix it?

If you have a c:\POPUPLOG.OS2, check to see if you have a number of entries like this:-

------------------------------------------------------------
06-23-2002 12:28:33 SYS2070 PID 5842 TID 0001 Slot 00ba
C:\EVAL\PERL-5.8\PERL-5.8.0-RC2\T\PERL.EXE
DB_FILHA->PERLB12E.malloc
127
------------------------------------------------------------

Every test which involves db.lib seems to fail. This is something to do with the redefinition of malloc since the release of Perl 5.7.3, but I've been unable to get it fixed for almost two months.

I think there are only a couple of people who are capable of fixing this, and it's probably a time-consuming task, although a quick fix may be possible if someone can make some sense of makedef.pl. If you compare the latest with the one in STABLE, you may see a section which has been removed. We need to get that back somehow...

> Sebastian

--
John

**= Email 36 ==========================**
Date: Tue, 16 Jul 2002 16:28:26 +0200 (CEST)
From: "Sebastian Wittmeier (ShadoW)"
Subject: Re: Perl 5.8.0 RC3

The following code snippet makes many tests fail:

eval { require AnyDBM_File }; # not all places have dbm* functions
if ($@) {
    print "ok\n";
    exit 0;
}

Require exits if dbm is not supported, but it should print "ok". I'm no Perl guru.
So, who can fix it?

Sebastian

**= Email 37 ==========================**
Date: Tue, 16 Jul 2002 16:33:54 +0200 (CEST)
From: "Sebastian Wittmeier (ShadoW)"
Subject: Re: XCOPY using WGET

On Tue, 16 Jul 2002 15:01:14 +0100, John Poltorak wrote:
>Is there anything I can add to make it work?

--passive-ftp?

**= Email 38 ==========================**
Date: Tue, 16 Jul 2002 16:54:15 +0200 (CEST)
From: "Sebastian Wittmeier (ShadoW)"
Subject: Re: Perl 5.8.0 RC3

On Tue, 16 Jul 2002 16:28:26 +0200 (CEST), Sebastian Wittmeier (ShadoW) wrote:
>The following code snippet makes many tests fail:
>
>eval { require AnyDBM_File }; # not all places have dbm* functions
>if ($@) {
>    print "ok\n";
>    exit 0;
>}

eval('require "AnyDBM_File"');

does work

Sebastian

**= Email 39 ==========================**
Date: Tue, 16 Jul 2002 17:14:15 +0200 (CEST)
From: "Sebastian Wittmeier (ShadoW)"
Subject: Re: XCOPY using WGET

On Tue, 16 Jul 2002 12:01:17 -0300 (ADT), lordspigol wrote:
>Where in the world is TAR/2?!

gtar258.zip from hobbes?

Sebastian

**= Email 40 ==========================**
Date: Tue, 16 Jul 2002 18:04:33 +0200 (CEST)
From: Stefan Neis
Subject: Re: cc

On Sun, 14 Jul 2002, John Poltorak wrote:
>
> I'm thinking of including cc.exe as a copy of gcc.exe in a standard
> UnixOS/2 distro.

IIRC, there are makefiles which do slightly different stuff depending on whether they are using cc or gcc, so the above would result in some ugly problems....

Regards,
Stefan
--
Micro$oft is not an answer. It is a question. The answer is 'no'.

**= Email 41 ==========================**
Date: Tue, 16 Jul 2002 18:04:57 +0100
From: csaba.raduly at sophos.com
Subject: Re: WGET & CURL

On 16/07/2002 11:32:13 owner-os2-unix wrote:
>I took a look at CURL some days ago. It goes well beyond wget.
>Wget is a downloader; CURL seems to be a sort of hacking tool
>that can also be a downloader.

cUrl has different aims. It has no recursive capabilities (that's left as an exercise :-).
It can do other things wget cannot (although wget is catching up, e.g. the
newest wget can do PUT).

>Tested the software of Brian Havard also. All in all, the only
>problem is that I think there may not be any program that can
>download the same way the browser does. Some sites need
>the browser to do the download. Is it possible to hack some way
>to trick the site into thinking a browser is downloading, when
>in reality it is something running in command-line mode?

The stupid way to do this is with User-Agent. All command-line downloaders
have a switch to fake their User-Agent. The more insidious schemes rely on
Referer and/or cookies. Wget can receive and send back cookies (cUrl too,
almost certainly). Both wget and cUrl pass a Referer header. With these in
place, most sites should be downloadable unless they employ some
Javashi^H^Hcript nastiness. (In order to defeat Referer checks, you'll need
to start the download with the web page and trust wget/cUrl to follow the
appropriate link.)

--
Csaba Ráduly, Software Engineer          Sophos Anti-Virus
email: csaba.raduly at sophos.com        http://www.sophos.com
US Support: +1 888 SOPHOS 9              UK Support: +44 1235 559933

**= Email 42 ==========================**

Date: Tue, 16 Jul 2002 18:08:44 -0700 (PDT)
From: Steve Wendt
Subject: Re: XCOPY using WGET

On Tue, 16 Jul 2002, lordspigol wrote:

> To the best of my knowledge it is possible only with
> an ftp client that is able to do server-to-server transfers.

Otherwise known as FXP?

> >I'm trying to get WGET to function as XCOPY between TCP/IP hosts, but
> >can't figure out the correct flags. Can anyone help me out?

Depending on what you're doing, rsync may be useful.

**= Email 43 ==========================**

Date: Tue, 16 Jul 2002 18:15:28 +0200
From: Kris Steenhaut
Subject: Re: Baseline toolset RC1

lordspigol schreef:

> MED is the right editor for UnixOS2. My luck is I love this editor. :)

Seconded! I followed your advice about 6 months ago, and I'm glad I did.
:-)

--
Groeten uit Gent,
Kris

**= Email 44 ==========================**

Date: Tue, 16 Jul 2002 18:25:53 +0200 (CEST)
From: Stefan Neis
Subject: Re: Baseline toolset RC1

Hi,

Thanks for assembling that list of files; I'm just installing a completely
new box...

One thing that looks like a _big_ problem is gcc-3 & make. Is there any
chance to get both working at the same time? gcc-3 comes with
gettext-0.10.40, and make-3.79 says it won't work with precisely that
version. Any suggestions how to work around that? Back to make-3.75 yet
again?

Regards,
Stefan
--
Micro$oft is not an answer. It is a question. The answer is 'no'.

**= Email 45 ==========================**

Date: Tue, 16 Jul 2002 19:57:46 +0200
From: Andreas Buening
Subject: Re: Baseline toolset RC1

Stefan Neis wrote:
>
> Hi,
>
> Thanks for assembling that list of files, I'm just installing a
> completely new box ...
>
> One thing that looks like a _big_ problem is gcc-3 & make.
> Is there any chance to get both working at the same time?
> gcc-3 comes with gettext-0.10.40 and make-3.79 says it won't work
> with precisely that version. Any suggestions how to work around
> that? Back to make-3.75 yet again?

You can
a) link gettext statically to make
b) compile make without gettext at all
c) solve the problem and tell the gcc guys that they should use a
   non-defective DLL

Btw, what happens if you replace gcc's intl.dll with another file?

bye,
Andreas
--
One OS to rule them all, One OS to find them,
One OS to bring them all and in the darkness bind them
In the Land of Redmond where the Shadows lie.

**= Email 46 ==========================**

Date: Tue, 16 Jul 2002 20:24:54 GMT+1
From: Illya Vaes
Subject: Re: XCOPY using WGET

** Reply to note from John Poltorak Tue, 16 Jul 2002 15:01:14 +0100

>ISTR there being something required when going through a proxy, but I
>can't remember precisely what.
>This doesn't work properly through a proxy:-
>
>wget -Ncr -nH --cut-dirs=3 -P %bld_home% ftp://unixos2:@213.152.37.92/pub/unixos2/build_system/
>
>Is there anything I can add to make it work?
>
>I do have the proxy settings working correctly in WGETRC

I don't really know WGETRC (I have used environment variables and arguments
so far). Does it include the settings for "http_proxy" and "ftp_proxy"
(environment variables, in the form "http://proxy.domain:8080/"),
"--proxy=on", "--proxy-user=" and "--proxy-passwd="?
(When you need to log in to the remote site, you can use --http-user and
--http-passwd in the same way.)

--
Illya Vaes (illya at vaeshiep.demon.nl)
"Do...or do not, there is no 'try'" - Yoda

**= Email 47 ==========================**

Date: Tue, 16 Jul 2002 22:11:05 +0200 (CEST)
From: "Sebastian Wittmeier (ShadoW)"
Subject: Re: WGET & CURL

On Tue, 16 Jul 2002 18:04:57 +0100, csaba.raduly at sophos.com wrote:

>Both wget and cUrl pass a Referer header. With these in place,
>most sites should be downloadable unless they employ some Javashi^H^Hcript
>nastiness. (In order to defeat Referer checks, you'll need to start the
>download with the web page and trust wget/cUrl to follow the appropriate
>link)

An important "tool" for that purpose is Lynx. It can display detailed
information about what is sent and received.

Sebastian
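Pulling the proxy and header-faking advice from the two threads above into
one sketch -- the proxy host, credentials and target URL below are all made
up, so substitute your own:

```shell
# Environment-variable form of the proxy settings Illya mentions
# (hypothetical proxy host and port):
export http_proxy="http://proxy.example.com:8080/"
export ftp_proxy="http://proxy.example.com:8080/"

# Mirror an FTP tree through the proxy, faking the User-Agent and
# Referer headers that some sites check (hypothetical URL and login):
wget --proxy=on --proxy-user=me --proxy-passwd=secret \
     --user-agent="Mozilla/4.0 (compatible)" \
     --referer="http://www.example.com/" \
     -Ncr -nH --cut-dirs=3 ftp://ftp.example.com/pub/some/dir/
```

If the FTP transfer still stalls behind the proxy or a firewall, adding
--passive-ftp (as suggested in Email 37) is the usual next thing to try.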