I've created a web crawler using Java and Play Framework 1.2.3. Now I'd like to crawl some web pages protected by a classic login/password form.
Essentially, it's like doing this Play test:
@Test
public void someTestOfASecuredAction() {
    Map<String, String> loginUserParams = new HashMap<String, String>();
    loginUserParams.put("username", "admin");
    loginUserParams.put("password", "admin");
    Response loginResponse = POST("/login", loginUserParams);

    Request request = newRequest();
    request.cookies = loginResponse.cookies; // this makes the request authenticated
    request.url = "/some/secured/action";
    request.method = "POST";
    request.params.put("someparam", "somevalue");

    Response response = makeRequest(request);
    assertIsOk(response); // Passes!
}
But I want to do this against an external website, not against the site generated by Play itself.
So I tried to do it with Play's WS library:
Map<String, String> params = new HashMap<String, String>();
params.put("telecom_username",
        Play.configuration.getProperty("telecom.access.user"));
params.put("telecom_password",
        Play.configuration.getProperty("telecom.access.pass"));

HttpResponse response = WS.url(loginUrl)
        .setParameters(params)
        .followRedirects(true)
        .post();
When I do that and look at response.getString(), I get the redirection page where cookies are set before continuing. But if I then request a protected page, I'm still not logged in. It's as if the cookies were never set, and the HttpResponse object has no cookie-related methods like the Response object in the test code above.
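What I was considering is handling the cookies manually, along these lines. This is an untested sketch: securedUrl is a placeholder for the real protected page, and I'm assuming that HttpResponse.getHeaders() exposes the raw Set-Cookie headers of the login response.

// Untested sketch: disable redirect-following so the login response (and its
// Set-Cookie headers) stays visible, then replay the cookies on the next request.
// Assumes: import play.Play; import play.libs.WS;
//          import play.libs.WS.HttpResponse; import play.mvc.Http;
Map<String, String> params = new HashMap<String, String>();
params.put("telecom_username",
        Play.configuration.getProperty("telecom.access.user"));
params.put("telecom_password",
        Play.configuration.getProperty("telecom.access.pass"));

HttpResponse loginResponse = WS.url(loginUrl)
        .setParameters(params)
        .followRedirects(false) // keep the 302, so its Set-Cookie headers are readable
        .post();

// Collect every Set-Cookie header into a single Cookie header value.
StringBuilder cookies = new StringBuilder();
for (Http.Header header : loginResponse.getHeaders()) {
    if ("Set-Cookie".equalsIgnoreCase(header.name)) {
        for (String value : header.values) {
            if (cookies.length() > 0) {
                cookies.append("; ");
            }
            // Keep only the "name=value" part, dropping attributes like Path/Expires.
            cookies.append(value.split(";", 2)[0]);
        }
    }
}

// Replay the session cookies when requesting the protected page.
HttpResponse securedResponse = WS.url(securedUrl)
        .setHeader("Cookie", cookies.toString())
        .get();

But I don't know if this is the right approach.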
I've also tried the authenticate() method on WS.url(), but it doesn't work either (as far as I can tell, it sets HTTP Basic credentials rather than submitting the login form).
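For reference, that attempt looked roughly like this:

// Tried HTTP Basic auth via authenticate() — the form-based site ignores it.
HttpResponse response = WS.url(securedUrl)
        .authenticate(
                Play.configuration.getProperty("telecom.access.user"),
                Play.configuration.getProperty("telecom.access.pass"))
        .get();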
I don't really know whether what I'm trying to do is possible with Play's WS library, but I could use some help on this ^^
Thanks a lot!