
Automating access to a page obscured behind a holding page

Archive - Originally posted on "The Horse's Mouth" - 2009-09-23 04:34:47 - Graham Ellis

Question: "I have a web page that I visit which sends me an initial response to say that it's working on the results, and then the results appear a few seconds later. I want to use an automated process / program on my computer to visit this page and store the final results, rather than having to access it through the browser. How can I do this?"

What an excellent question - we've all visited pages that come back and say 'I'm working on those results' and then 'here are your final results' a few seconds later, and wouldn't it be great to be able to call them up automatically - to script the accesses to the page - so that you could make good use of some excellent resources.

The answer comes in two stages. Firstly, you need to understand which of several ways the server is using to send the intermediate pages. It could be sending a single multipart response, with a pause between each of the sections (that's server push). It could be sending a series of pages, each one fetched as a separate request by the client (that's client pull). Or it could be sending an initial page that includes Javascript which reads further data back from the server via an AJAX type connection. The particular example that I was asked to look at was client pull, and I wrote a short server side script to illustrate that. You can run it here and see the source code (it's in PHP) here. The source code is fully commented!
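To show the general shape of a client pull service - without reproducing the PHP demonstration itself, which you'll find via the links above - here's a minimal sketch in Perl CGI. The script name, the stage counting via the query string and the five second delay are illustrative assumptions, not a copy of the demo:

#!/usr/bin/perl
# Minimal "client pull" holding page, sketched in Perl CGI.
# A Refresh header asks the browser to come back a few seconds later;
# when the work is done, the final page is sent with no Refresh header.
# Script name (holding.cgi) and stage handling are illustrative.
use strict;
use warnings;

my $stage = 0;
$stage = $1 if ($ENV{QUERY_STRING} || '') =~ /(\d+)/;

if ($stage < 3) {
    my $next = $stage + 1;
    print "Refresh: 5; url=holding.cgi?$next\r\n";   # poll again in 5 seconds
    print "Content-type: text/plain\r\n\r\n";
    print "Intermediate: Running ... ", $stage * 5, " seconds gone\n";
} else {
    print "Content-type: text/plain\r\n\r\n";        # no Refresh - polling stops
    print "Completed ...\n", scalar localtime, "\n";
}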

Secondly (and it really is second - you must understand how your server is working before you try to script the front of it!), use a module such as Perl's LWP::UserAgent to automate the access. I've written a full example here which scripts the connection to the demonstration service described in the paragraph above ... when I run it, the results I get are as follows:

-bash-3.2$ perl coojar
Intermediate: Running ... 0 seconds gone
Intermediate: Running ... 5 seconds gone
Intermediate: Running ... 10 seconds gone
Intermediate: Running ... 15 seconds gone
Intermediate: Completed ...
04:13:51 up 3 days, 12:18, 1 user, load average: 0.18, 0.13, 0.16
 
FINAL RESULT: Completed ...
04:13:51 up 3 days, 12:18, 1 user, load average: 0.18, 0.13, 0.16
 
-bash-3.2$


Once again, the code is fully commented - so have a look at the source to see how it works. One thing of note is the concept of a cookie jar - this is the object in memory in which all the cookies are stored. We have chosen, for this particular application, to have our own independent set of cookies rather than to share the ones that another browser has on the same client machine, but there are other Perl modules which let you connect to / share the cookie files (cookie jars) of other browsers. The setting of the agent name in the program is no more than a courtesy in most cases; please note, though, that if you're running this against our server you do need to specify an agent name, as we have additional security traps in place and LWP clients using the default agent name - intentionally - don't work with this script.
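As an outline of the kind of client described above (the complete, commented example is linked earlier), here is a minimal sketch using LWP::UserAgent with its own in-memory cookie jar and an explicit agent name. The URL, the agent string and the assumption that intermediate pages are signalled by a Refresh header are illustrative - the real demonstration may well signal the reload with an HTML meta refresh instead:

#!/usr/bin/perl
# Minimal sketch: poll a "client pull" service with LWP::UserAgent until
# the final page arrives. URL, agent name and poll limit are illustrative.
use strict;
use warnings;
use LWP::UserAgent;
use HTTP::Cookies;

my $url = 'http://www.example.com/holding.cgi';   # assumed demo URL

my $browser = LWP::UserAgent->new;
$browser->agent('demo_client/1.0');               # courtesy agent name (illustrative)
$browser->cookie_jar(HTTP::Cookies->new);         # private, in-memory cookie jar

for my $attempt (1 .. 20) {                       # give up after 20 polls
    my $response = $browser->get($url);
    die "Request failed: ", $response->status_line, "\n"
        unless $response->is_success;

    # A Refresh header means "not ready yet - come back shortly"
    if (my $refresh = $response->header('Refresh')) {
        my ($delay, $next) = $refresh =~ /^(\d+)(?:\s*;\s*url=(\S+))?/i;
        print "Intermediate: ", $response->decoded_content;
        $url = $next if $next;                    # the service may move us on
        sleep($delay || 5);
        next;
    }

    # No Refresh header - this is the final page
    print "FINAL RESULT: ", $response->decoded_content;
    last;
}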

LWP::UserAgent is a very powerful automated browsing module and there are other examples in our trainee / source code directory ... including:

Simple LWP browser for testing
Grabbing current currency exchange rates
Validating all links from a page
Checking a static site for changes

Do remember whenever you use LWP::UserAgent that you are writing an automaton, and you should check the robots.txt file at the server to ensure that your activities are acceptable to the server's operator. See here and here for more about robots.txt.
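If you would rather have the robots.txt check handled for you, the LWP::RobotUA module (a subclass of LWP::UserAgent) fetches and honours each site's robots.txt automatically. A minimal sketch - the agent name, contact address and URL below are placeholders rather than values from the example above:

#!/usr/bin/perl
# Minimal sketch using LWP::RobotUA, which fetches and obeys robots.txt
# before each request. Agent name, email address and URL are placeholders.
use strict;
use warnings;
use LWP::RobotUA;

my $robot = LWP::RobotUA->new(
    agent => 'demo_client/1.0',
    from  => 'someone@example.com',
);
$robot->delay(1/60);   # at least one second between requests (delay is in minutes)

my $response = $robot->get('http://www.example.com/somepage.html');
if ($response->is_success) {
    print $response->decoded_content;
} else {
    print "Blocked or failed: ", $response->status_line, "\n";
}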