<%@ page contentType="text/html;charset=gb2312"%>
<%@ page import="java.net.*,java.io.*"%>
<%
String sCurrentLine = "";
String sTotalString = "";
InputStream l_urlStream;
// The target address was left blank in the original; fill in the page to fetch.
URL l_url = new URL("http:///");
HttpURLConnection l_connection = (HttpURLConnection) l_url.openConnection();
l_connection.connect();
l_urlStream = l_connection.getInputStream();
BufferedReader l_reader = new BufferedReader(new InputStreamReader(l_urlStream));
while ((sCurrentLine = l_reader.readLine()) != null)
{
    sTotalString += sCurrentLine;
}
out.println(sTotalString);
%>
Postscript
Although the code is fairly simple, I think a basic "web crawler" could be built on top of it: extract the href links from a fetched page, fetch each of those links in turn, and keep going recursively (limiting the crawl depth if desired). In this way, a simple "web search" function can be implemented.
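The crawler idea above can be sketched in plain Java. This is a minimal illustration, not a production design: the class name `MiniCrawler`, the href-matching regex, and the depth-limited recursion are all assumptions of this sketch (a real crawler would use an HTML parser, resolve relative URLs, and respect robots.txt). The `fetch` method uses the same `HttpURLConnection` approach as the JSP code above.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MiniCrawler {
    // Naive matcher for href="..." attributes; a real crawler would use an HTML parser.
    private static final Pattern HREF =
        Pattern.compile("href\\s*=\\s*\"([^\"]+)\"", Pattern.CASE_INSENSITIVE);

    private final Set<String> visited = new HashSet<>();

    // Return every href attribute value found in the given HTML string.
    public static List<String> extractLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = HREF.matcher(html);
        while (m.find()) {
            links.add(m.group(1));
        }
        return links;
    }

    // Fetch a page body, line by line, as in the JSP snippet above.
    static String fetch(String address) throws IOException {
        HttpURLConnection conn =
            (HttpURLConnection) new URL(address).openConnection();
        conn.connect();
        StringBuilder page = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                page.append(line);
            }
        }
        return page.toString();
    }

    // Depth-limited recursive crawl: fetch a page, extract its links, recurse.
    public void crawl(String address, int depth) throws IOException {
        if (depth <= 0 || !visited.add(address)) {
            return; // depth limit reached, or page already seen
        }
        String page = fetch(address);
        for (String link : extractLinks(page)) {
            if (link.startsWith("http")) { // skip relative links in this sketch
                crawl(link, depth - 1);
            }
        }
    }
}
```

Calling `new MiniCrawler().crawl(startUrl, 3)` would then follow links up to three levels deep, with the `visited` set preventing the endless re-fetching the postscript warns about.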