How can I get the content of a webpage?

For a project, I need to get the HTML content of an external website. My first thought was to use GetRequest_Submit for this, but I only get empty responses. I also tried several things with jQuery on the client side, but then you run into cross-domain restrictions.

Does anyone have an idea how to approach this?
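For context, the usual way around the cross-domain problem is to fetch the page server-side instead of from the browser. A minimal sketch in Python (not platform-specific code; the URL and User-Agent value are illustrative assumptions):

```python
# Server-side fetch: running the request on the server avoids the
# browser's same-origin (cross-domain) restrictions that jQuery hits.
# Uses only the Python standard library.
from urllib.request import Request, urlopen

def fetch_html(url: str, timeout: float = 10.0) -> str:
    # Some sites return empty or error responses to requests without
    # a User-Agent header, so we set one (value is illustrative).
    req = Request(url, headers={"User-Agent": "Mozilla/5.0"})
    with urlopen(req, timeout=timeout) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        return resp.read().decode(charset, errors="replace")

# Example usage (commented out to avoid a live network call):
# html = fetch_html("https://example.com")
# print(html[:200])
```

An empty response from a server-side request is often a missing header or a redirect the client did not follow, so checking the response status and headers is a good first debugging step.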


Nuno Reis wrote:

Have you tried the scraping component?

Thank you, that looks like exactly what I am looking for!

Test it.

If it works for you, please mark my comment as a solution.

So, it not only scrapes but also provides a complete HTML DOM solution for searching and handling elements. Perfect, except that I now need to scrap most of my own code because this extension does it all!