Web scraping and copying data into Google Sheets

Hi, I intend to scrape data from a local stock exchange site whose stock data changes every two days. I want to use a Python script to copy the stock price table named 'All' from https://rsebl.org.bt/#/equity-market into my Google Sheet. I followed the usual steps (getting Google API credentials, copying the data, etc.), but it is not working. Please help. This is my script.
import requests
from bs4 import BeautifulSoup
import gspread
from oauth2client.service_account import ServiceAccountCredentials
import schedule
import time

# Function to scrape data and update Google Sheets
def scrape_and_update():
    # URL of the website
    url = 'https://rsebl.org.bt/#/equity-market'

    # Send a GET request to the website
    response = requests.get(url)
    response.raise_for_status()  # Check if the request was successful

    # Parse the HTML content
    soup = BeautifulSoup(response.content, 'html.parser')

    # Find the table (you might need to adjust the selector based on the actual HTML structure)
    table = soup.find('table')
    if table is None:
        raise RuntimeError('No <table> found - the page may be rendered by JavaScript')

    # Extract table data
    data = []
    for row in table.find_all('tr'):
        cols = row.find_all('td')
        cols = [col.text.strip() for col in cols]
        data.append(cols)

    # Google Sheets API setup
    scope = ["https://spreadsheets.google.com/feeds", "https://www.googleapis.com/auth/drive"]
    creds = ServiceAccountCredentials.from_json_keyfile_name(r'C:\Users\DELL\AppData\Local\Programs\Python\XXXXX.json', scope)
    client = gspread.authorize(creds)

    # Open the Google Sheets file and select the sheet
    sheet = client.open('Shares Watchlist').worksheet('Sheet1')

    # Clear the existing data
    sheet.clear()

    # Update the sheet with new data
    for i, row in enumerate(data):
        sheet.insert_row(row, i + 1)

# Schedule the task to run every minute for testing purposes
schedule.every(1).minutes.do(scrape_and_update)

# Keep the script running
while True:
    schedule.run_pending()
    time.sleep(1)

# Schedule the task to run every Tuesday and Friday
#schedule.every().tuesday.at("10:00").do(scrape_and_update)
#schedule.every().friday.at("10:00").do(scrape_and_update)

# Keep the script running
#while True:
#    schedule.run_pending()
#    time.sleep(1)

What is the error message that it gives you when you run this program?

Also, was this a script you wrote or was it generated by ChatGPT?
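One likely culprit, given the `#/` in the URL: everything after the `#` is a client-side route, so `requests.get()` only fetches the empty page shell and `soup.find('table')` returns `None`, which would then crash on `table.find_all('tr')`. The parsing loop itself is fine once a table is actually present. Here is a self-contained sketch of that loop against a stand-in HTML snippet (the table contents are invented for illustration):

```python
from bs4 import BeautifulSoup

# Stand-in HTML for the stock table; the real page at
# https://rsebl.org.bt/#/equity-market appears to render its table
# with JavaScript, so requests may never see markup like this.
html = """
<table>
  <tr><th>Symbol</th><th>Price</th></tr>
  <tr><td>ABC</td><td>100.50</td></tr>
  <tr><td>XYZ</td><td>42.00</td></tr>
</table>
"""

soup = BeautifulSoup(html, 'html.parser')
table = soup.find('table')
if table is None:
    raise RuntimeError('No <table> in the HTML - page is likely rendered client-side')

# Same extraction logic as the original script, but including header cells
data = []
for row in table.find_all('tr'):
    cols = [col.text.strip() for col in row.find_all(['td', 'th'])]
    data.append(cols)

print(data)  # [['Symbol', 'Price'], ['ABC', '100.50'], ['XYZ', '42.00']]
```

If you run `print(soup.find('table'))` against the live page and get `None`, plain requests + BeautifulSoup won't work; you would need a browser-driven tool such as Selenium or Playwright, or the site's underlying JSON endpoint, to get the rendered data.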

Hi, any help or guidance on this method or workflow would be highly appreciated. Simple scraping of stock data is hard for non-coders, and the fees charged for it are unaffordable :((…
I tried to get help online, but no one could give a detailed workflow. With kind regards,

Hi, for non-coders it's hard to get by, and the fees for scraping even a small stock data table are unaffordable… I looked for online DIY-style coding guides but was not successful. Any help or guidance towards this data scraping would be appreciated. With kind regards