firefox instances
In order to run firefox (mozilla or iceweasel) on a machine other than the local box while firefox is also running locally, the program should be launched as:
firefox -no-remote
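For instance, a remote instance can be launched from an SSH session with X forwarding (the user and host below are only illustrative):
$ ssh -X user@remote.example.org
$ firefox -no-remote &
Without -no-remote, the remote invocation would typically just open a new window in the already running local instance.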
iceweasel
In order to sign documents of the Junta de Andalucía or the Universidad de Huelva with firefox (mozilla or iceweasel), the following steps should be carried out. This has been tested with the electronic administration website of the UHU (www.uhu.es/ae).
First, a certificate from the FNMT accrediting the identity of the person who is going to sign the document should be among the certificates available to the browser. If it is not, it should be imported (Preferences -> Advanced -> Encryption -> View Certificates -> Import).
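The certificates known to the browser can also be listed from the command line with the certutil tool from the libnss3-tools package (the profile directory name below is a placeholder; use the one actually found under ~/.mozilla/firefox):
$ certutil -L -d ~/.mozilla/firefox/xxxxxxxx.default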
Once this is done, the sun-java6-plugin package should be installed, and the browser should be restarted after the installation.
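It can be installed in the usual way (a sketch, assuming the package is available in the configured APT repositories):
# apt-get install sun-java6-plugin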
The files jss33.jar and libjss3.so have to be downloaded; a page with the instructions is opened when the browser detects their absence. They have to be copied, with the appropriate permissions, to the right plugin directories:
# chmod 644 jss33.jar
# cp jss33.jar /usr/lib/jvm/java-6-sun/jre/lib/ext/
# chmod 755 libjss3.so
# cp libjss3.so /usr/lib/jvm/java-6-sun/jre/lib/i386/client/
Once this is done, the electronic administration page should allow us to fill in the form and (hopefully) to sign it.
The original console web browser is lynx. To avoid being asked whether or not you accept cookies, it can be invoked with the option
$ lynx -accept_all_cookies
Another interesting option is -dump. With this option the program writes the rendered page to the standard output. More modern and flexible is the links browser, which can be used in text mode but can also be compiled to run on a graphical display (option -g). The -dump option is also available there. A strictly text-mode replacement for lynx (and a more modern one) is elinks, which has several interesting options.
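For instance, to save a plain-text rendering of a page to a file (the URL is only illustrative):
$ lynx -dump http://www.example.org > page.txt
The same kind of invocation works with links and elinks.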
In order to keep an offline copy of some web pages, so that you can browse them later, you can use curl, which can upload or download data to or from a server. You can specify multiple URLs by ranges, as in the following two examples
$ curl http://site.{s1,s2,s3}.org
$ curl http://site.s[1-3].org
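By default curl writes the pages to the standard output; to save each member of the range to its own file, the glob variable #1 can be used in the output name (URL and file names are illustrative):
$ curl http://site.s[1-3].org/index.html -o "index_#1.html"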
A copy of an entire site can be made using wget
$ wget -k -r -p http://www.interesting_site.org
The -r option recurses through the site links starting from http://www.interesting_site.org/index.html. The -k option converts the links to relative ones, allowing correct navigation through the downloaded pages. The -p option downloads all the extra content needed to display each page. This command makes a true mirror of a site on your computer.
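If only part of a site is wanted, the -np (no-parent) option keeps the recursion from climbing above the starting directory (the URL is illustrative):
$ wget -k -r -p -np http://www.interesting_site.org/docs/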
Finally, the program wput uploads content to the internet over FTP, with a syntax similar to that of wget.
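A minimal sketch of its use (user, password, host and paths are illustrative):
$ wput report.pdf ftp://user:password@ftp.example.org/public_html/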
In order that files can be listed and downloaded when accessing a directory on a webserver, the following line should be added to the .htaccess file
Options +Indexes
Be warned that the contents of all subdirectories of the directory will also be listable and downloadable...
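If listing must be disabled again for a particular subdirectory, the opposite directive can be placed in that subdirectory's .htaccess file:
Options -Indexes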
Some Mini-Howtos of Interest
Curro Perez-Bernal <francisco.perez@dfaie.uhu.es>