I need someone to visit my website and its 20 sub-domains and copy all of the text content and photos.
Because every page to be copied has the exact same layout and fields, I am assuming this person will write a script to go through my web pages and save a copy of the content (XML or XLS is a great format), along with a copy of all of the photos in a dedicated folder, so I can then upload everything into my new database.
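Since every page shares one layout, the job boils down to extracting the same few fields from each page and writing them to a structured file. A minimal sketch of that idea is below; the field names, CSS classes, and sample markup are assumptions for illustration only, not the real site's markup, and a production version would fetch live pages and use a proper HTML parser rather than regular expressions.

```python
import csv
import re

# Hypothetical page layout: the real site's markup and class names
# would replace this sample.
SAMPLE_HTML = """
<div class="listing">
  <h1 class="title">Blue Widget</h1>
  <p class="description">A sturdy blue widget.</p>
  <img class="photo" src="/images/blue-widget.jpg">
</div>
"""

# One pattern per field; assumed stable because every page uses the
# same template.
FIELD_PATTERNS = {
    "title": re.compile(r'<h1 class="title">(.*?)</h1>', re.S),
    "description": re.compile(r'<p class="description">(.*?)</p>', re.S),
    "photo_url": re.compile(r'<img class="photo" src="(.*?)"'),
}

def extract_record(html: str) -> dict:
    """Pull the known fields out of one page's HTML."""
    return {name: (m.group(1).strip() if (m := pat.search(html)) else "")
            for name, pat in FIELD_PATTERNS.items()}

def write_csv(records, path="content.csv"):
    """Write all extracted records to a spreadsheet-friendly CSV file."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=list(FIELD_PATTERNS))
        writer.writeheader()
        writer.writerows(records)

if __name__ == "__main__":
    record = extract_record(SAMPLE_HTML)
    print(record["title"])
```

Photos could then be downloaded into one folder by looping over each record's `photo_url` with `urllib.request.urlretrieve`, and the CSV swapped for XML output via `xml.etree.ElementTree` if that format is preferred.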
I recently had someone try a scraper, but it did a poor job and did not collect all of the data I need. I need this done as soon as possible.
I am a domain expert in data scraping and content extraction. Send me a URL of your site and I will send you a demo of what I extract. If you like it, we can proceed further.