Web scraper save response

Hi - newbie here. I am trying to gather simple data from multiple URLs. I can run the scrape and it works well, but sometimes I need thousands of responses, so I can't copy and paste them into Excel like I normally would, as the window only shows about 1500 responses.

Here is the scrape I am using; as you can see, the data is very simple text which I can convert in Excel.

import requests

for i in range(1, 10):
    URL = "https://gateway.pinata.cloud/ipfs/QmUJrnabRCMLsnvXNryojLWcysc4WwJCLqWYvJcWADfZFo/chadsJSON/" + str(i) + ".json"
    page = requests.get(URL)
    print(page.text)

Can anyone help me improve this so all responses get saved to a text or CSV file?

Thanks

If all you want is page.text saved to a text file, it's very simple to do. For each response, you just open a file, write the text, and you're done. For example:

# Open a new text file for writing with standard UTF-8 encoding
with open(f"response_{i}.txt", mode="w", encoding="utf-8") as resp_file:
    resp_file.write(page.text)  # Write the response text to the file

(Note that I used an f-string to format the name, f"response_{i}.txt", which is equivalent to "response_" + str(i) + ".txt" but shorter, easier to read, more performant, and harder to introduce bugs with.)
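If you would rather have every response end up in one text file, a minimal sketch is to open the file once, before the loop, and write each response into it (the filename all_responses.txt is just an example):

import requests

# Open one output file before the loop and write every response into it
# (all_responses.txt is an example name; use whatever you like)
with open("all_responses.txt", mode="w", encoding="utf-8") as resp_file:
    for i in range(1, 10):
        URL = "https://gateway.pinata.cloud/ipfs/QmUJrnabRCMLsnvXNryojLWcysc4WwJCLqWYvJcWADfZFo/chadsJSON/" + str(i) + ".json"
        page = requests.get(URL)
        resp_file.write(page.text + "\n")  # one response per line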

The official Python tutorial offers a basic guide to reading and writing files. It also discusses how to write JSON, a simple, widely used, machine-readable way to store structured data that is easy to work with.

In fact, your downloaded data is already formatted as JSON, so you can swap the extension above from .txt to .json and use Python's built-in json module if you need to read it back in. Alternatively, if you just want to work with it directly from the page object without saving it (e.g. to select certain keys to save to a file), use page.json() to interpret your data as JSON, which gives you a regular Python dictionary that's easier to work with than raw text.
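Here is a quick sketch of both of those options for a single page (the key lookup at the end is a placeholder; use whatever keys your JSON actually contains):

import requests

page = requests.get("https://gateway.pinata.cloud/ipfs/QmUJrnabRCMLsnvXNryojLWcysc4WwJCLqWYvJcWADfZFo/chadsJSON/1.json")

# Option 1: save the raw JSON text straight to a .json file
with open("response_1.json", mode="w", encoding="utf-8") as resp_file:
    resp_file.write(page.text)

# Option 2: parse it into a Python dictionary and work with it directly
data = page.json()
print(data.keys())      # see which keys the JSON actually has
# print(data["name"])   # then pick out whichever key(s) you need ("name" is a placeholder)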

If you want to save it as CSV (which you can open in Python, Excel, your text editor, or a wide variety of other programs), I'm guessing you want to convert the JSON keys to columns, with the contents of each JSON file becoming one row? You can read and write CSVs using Python's included csv module, or (recommended, if you don't mind using pandas) you can read a list of dictionaries (i.e. a list of the output of page.json() for each page) directly into a pandas.DataFrame(), manipulate it as you like in Python, and write it out to CSV with df.to_csv(), or any other format you choose.
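A rough sketch of the pandas route, assuming each JSON file is a flat set of key/value pairs (output.csv is just an example filename):

import pandas as pd
import requests

rows = []
for i in range(1, 10):
    URL = "https://gateway.pinata.cloud/ipfs/QmUJrnabRCMLsnvXNryojLWcysc4WwJCLqWYvJcWADfZFo/chadsJSON/" + str(i) + ".json"
    page = requests.get(URL)
    rows.append(page.json())  # each JSON becomes one dictionary, i.e. one row

df = pd.DataFrame(rows)                # JSON keys become the columns
df.to_csv("output.csv", index=False)   # index=False drops pandas' row numbers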