Creating a Web Text Scraper with Visual Basic

Introduction

Web scraping is a term that is becoming increasingly popular in the development world, perhaps because developers are always trying to make things more convenient for users. At first, I wasn't a big fan of scraping, because it can be used to obtain data that was never intended to be exposed to a user. Today you will create a program to scrape text from a website.

Web Scraping

Here is a concise definition of web scraping: it is the process of programmatically extracting data from websites, typically by downloading a page's HTML and pulling out the text or values you need.

Our Project

Open Visual Studio 2012, and create a VB.NET Windows Forms project. Name it anything you like and design it as shown in Figure 1. Judging by the code that follows, the form needs two multiline TextBoxes (txtScrape and txtFormatted) and a Button (btnExtract).

Figure 1 - Our Design

Coding

There isn't much coding involved. Be warned, though: that doesn't mean the code you will learn today is easy. Nothing worthwhile ever is. But you're not here to learn about my life lessons, nor my philosophy, so let's get the show on the road.

Add the following Imports above your Form's class definition:

Imports System.Text
Imports System.Net
Imports System.IO
Imports System.Text.RegularExpressions

You will use the System.Text and System.Text.RegularExpressions namespaces to manipulate the HTML of the web page being read. HTML tags follow a specific format: they are enclosed within <> signs. Apart from the ordinary HTML tags, a page may also contain script languages such as VBScript and JavaScript, server-side code such as ASP and PHP, and CSS (Cascading Style Sheets, which control the formatting of web pages). We need to take all of these factors into consideration when dealing with web pages.
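
To make this a bit more concrete, here is a minimal sketch (the HTML fragment is made up purely for illustration) of how Regex.Replace strips tags from a string:

    ' A made-up HTML fragment used purely for illustration
    Dim strHtml As String = "<p>Hello, <b>world</b>!</p>"

    ' Replace every well-formed tag with an empty string
    Dim strText As String = Regex.Replace(strHtml, "</?[a-z][a-z0-9]*[^<>]*>", "", RegexOptions.IgnoreCase)

    Console.WriteLine(strText) ' Prints: Hello, world!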

The System.Net namespace provides objects such as WebResponse, which carries the server's response back to the calling application, and HttpWebRequest, which lets us send a request over HTTP. HTTP stands for Hypertext Transfer Protocol, the protocol that deals with web pages. There are other protocols, such as FTP (File Transfer Protocol), but that is a topic for another day.

The System.IO namespace gives you the StreamReader object, which you can use to read any stream of data.
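
To see how these pieces fit together before writing the full Scrape sub, here is a minimal sketch (the URL is just an example) that downloads a page's HTML into a string:

    ' Minimal sketch: fetch the raw HTML of a page into a string
    Dim wrRequest As WebRequest = WebRequest.Create("http://codeguru.com")

    Using wrResponse As WebResponse = wrRequest.GetResponse()
        Using sr As New StreamReader(wrResponse.GetResponseStream())
            Dim strHtml As String = sr.ReadToEnd()
            Console.WriteLine(strHtml.Length.ToString() & " characters downloaded")
        End Using
    End Using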

Let's put all of these technologies together. Add the following sub to your application:

    Private Sub Scrape()

        Try

            Dim strURL As String = "http://codeguru.com"

            Dim strOutput As String = ""

            Dim wrResponse As WebResponse
            Dim wrRequest As WebRequest = WebRequest.Create(strURL)

            txtScrape.Text = "Extracting..." & Environment.NewLine

            wrResponse = wrRequest.GetResponse()

            Using sr As New StreamReader(wrResponse.GetResponseStream())
                ' Read the entire response; the Using block disposes of the StreamReader
                strOutput = sr.ReadToEnd()
            End Using

            txtScrape.Text = strOutput

            'Formatting Techniques

            ' Remove HTML Comments first, so commented-out markup disappears in one go
            strOutput = Regex.Replace(strOutput, "<!--(.|\s)*?-->", "")

            ' Remove Script blocks (must happen before the generic tag removal below)
            strOutput = Regex.Replace(strOutput, "<script.*?</script>", "", RegexOptions.Singleline Or RegexOptions.IgnoreCase)

            ' Remove Stylesheets (must also happen before the generic tag removal)
            strOutput = Regex.Replace(strOutput, "<style.*?</style>", "", RegexOptions.Singleline Or RegexOptions.IgnoreCase)

            ' Remove Doctype ( HTML 5 )
            strOutput = Regex.Replace(strOutput, "<!(.|\s)*?>", "")

            ' Remove remaining HTML Tags
            strOutput = Regex.Replace(strOutput, "</?[a-z][a-z0-9]*[^<>]*>", "", RegexOptions.IgnoreCase)

            txtFormatted.Text = strOutput 'write formatted output to a separate TextBox


        Catch ex As Exception

            MessageBox.Show(ex.Message, "Error")

        End Try

    End Sub

Let me break my logic down for you. I created a string object to hold the URL (Uniform Resource Locator; in layman's terms, a web address) from which I will be scraping text. In this case it is Codeguru.com. Next, I created a WebResponse object and a WebRequest object; WebRequest.Create builds a request to the specified URL (for an http address it actually returns an HttpWebRequest behind the scenes). After the request is created, I call GetResponse on it, which hands back the WebResponse containing the text the HTTP protocol sent back to us, and a StreamReader reads that response stream into a string. Now, the tricky part...

Once we have the text, we need to format it appropriately. This is where the regular expressions come in. If you haven't heard of regular expressions before, have a look through this article of mine. Regular expressions make it easy to strip out or keep certain strings in an appropriate way. Here I had to compensate for the HTML tags, HTML comments, and possible script and CSS style blocks.
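
Note that the order of the replacements matters: whole script and style blocks have to be removed before the generic tag pattern runs; otherwise their opening and closing tags are stripped but the code inside them is left behind. A small illustration, using a made-up fragment:

    ' Made-up fragment for illustration only
    Dim strSample As String = "<p>Hi</p><script>var x = 1;</script>"

    ' Remove whole script blocks first...
    strSample = Regex.Replace(strSample, "<script.*?</script>", "", RegexOptions.Singleline Or RegexOptions.IgnoreCase)

    ' ...then strip the remaining tags
    strSample = Regex.Replace(strSample, "</?[a-z][a-z0-9]*[^<>]*>", "")

    Console.WriteLine(strSample) ' Prints: Hi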

To finish up, you need to call the Scrape sub. Add this code now:

    Private Sub btnExtract_Click(sender As Object, e As EventArgs) Handles btnExtract.Click

        Scrape() 'Scrape Text From URL

    End Sub

Conclusion

Very interesting stuff indeed! As you can see, it is very easy to scrape text from websites. All you need is a basic understanding of HTML and VB.NET. If you are interested in downloading images from websites, you can have a look here. Until next time, cheers!


