Now, to fetch this data into a Google Spreadsheet, use the following script:
/////////////////////////////////////////////////
function extract(url) {
  // Fetch the URL and parse the response body as JSON.
  var response = UrlFetchApp.fetch(url);
  var data = JSON.parse(response.getContentText());
  var comm = data.comm;

  // Each entry of comm is an object with a single key/value pair;
  // convert it to a [key, value] pair of integers.
  var keyVal = function(o) {
    var key = Object.keys(o)[0];
    return [parseInt(key, 10), parseInt(o[key], 10)];
  };

  var result = [];
  for (var i = 0; i < comm.length; i++) {
    result.push(keyVal(comm[i]));
  }
  return result;
}
/////////////////////////////////////////////////
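For context, the script expects the URL to return JSON with a top-level comm array whose entries are single-key objects. The dates and values below are made up purely for illustration:

/////////////////////////////////////////////////
// Hypothetical response body:
//   { "comm": [ { "20140101": "42" }, { "20140102": "57" } ] }
//
// extract(url) would then return:
//   [ [20140101, 42], [20140102, 57] ]
/////////////////////////////////////////////////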
Have a look at the following screenshot:
I have the URL in cell A1, and the following formula in cell B1:
=TRANSPOSE(extract(A1))
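Because extract returns one [key, value] pair per row, wrapping it in TRANSPOSE lays the pairs out across the columns of row 1 instead; custom functions automatically spill their array results into the neighbouring cells.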
Kishan, that's a really elegant solution.
The first line represents a date (3 years of history) and is standard for all the URLs being scraped. How would you modify the code to include only the second line (the data)?
Given a list of URLs in column A, what would you change in the code to run through all the URLs?
Assuming that Google or the target site doesn't like me pulling so much data at once, is there a way to stagger the imports?
I have updated the code in your spreadsheet.
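The updated code isn't reproduced in the thread, but a sketch addressing all three questions might look like the following. The extractAll name and the one-second delay are illustrative assumptions, not the actual update:

/////////////////////////////////////////////////
// Hypothetical sketch, not the actual update. Given a range of URLs
// (e.g. =extractAll(A1:A10)), fetch each one, drop the first row
// (the shared date line), and pause between requests.
function extractAll(urls) {
  var result = [];
  for (var i = 0; i < urls.length; i++) {
    var url = urls[i][0];      // a range arrives as a 2D array, one row per cell
    if (!url) continue;        // skip blank cells
    result = result.concat(extract(url).slice(1)); // slice(1) drops the date row
    Utilities.sleep(1000);     // stagger requests by one second
  }
  return result;
}
/////////////////////////////////////////////////

Note that custom functions are subject to a 30-second execution limit, so for a long list of URLs a time-driven trigger that writes the values with setValues() would be the safer route.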