I have 2 Windows 2019 servers. One on the internet and the other in a lab environment.
BOTH have Godaddy signed SSL certs.
BOTH can be accessed over Https with Edge, Chrome and Firefox.
I have the simplest possible code.
YET when trying to access the lab server I get the dreaded
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: unable to get local issuer certificate (_ssl.c:992)>
I am using Pycharm 2022.3.2 and Python 3.11.2
The workstation running pycharm is Windows 10.
Both the lab server and the workstation are VirtualBox VMs. The live box is a physical server and, as mentioned, is running Server 2019.
This is driving me nuts.
I have tried SOOO many things…
Any help would be greatly appreciated.
Thanks
Tom
Sorry, I should have been clearer. I can browse to BOTH servers via Edge, Chrome and Firefox SECURELY. ALL browsers show SECURE connections.
I have done nothing on the workstation beyond installing Python and PyCharm.
I would have no need to modify CA stores. If multiple browsers can access the sites without modification, why would I have to modify the CA store?
Thanks for the early reply.
Maybe an intermediate certificate is missing in the CA store.
Browsers apparently use “AIA chasing” in that case (fetching the missing intermediate from the URL in the certificate’s Authority Information Access extension). Python apparently does not.
You can check the certificate chain in your browser. E.g. for this site it’s discuss.python.org → R3 → ISRG Root X1. So if R3 is missing in your CA Store, browsers don’t mind, but Python does.
That is, unless the server is configured to also send the intermediate certificate. So maybe only one of your servers does that. You can confirm that using e.g. a test tool like this, which presumably shows them under “Additional Certificates (if supplied)”.
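You can also reproduce the same verification urllib performs with a small stdlib-only handshake check (a sketch; the host name in the example comment is just a placeholder for whichever server you want to test):

```python
import socket
import ssl

def check_tls(host: str, port: int = 443) -> str:
    """Attempt a verified TLS handshake, as urllib.request would."""
    ctx = ssl.create_default_context()  # uses the default CA store
    with socket.create_connection((host, port), timeout=5) as sock:
        try:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return "OK: " + tls.version()
        except ssl.SSLCertVerificationError as exc:
            # e.g. verify_code 20 = "unable to get local issuer certificate"
            return f"FAILED: {exc.verify_message} (code {exc.verify_code})"

# Example (placeholder host): print(check_tls("discuss.python.org"))
```

If the lab server fails here with code 20 while the live one passes, that points at a chain problem on the lab server rather than at Python itself.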
Hi Peter,
The 2 servers have identical configs, apart from the SSL certs. Since the lab machine is private, the link you sent will not work.
in your statement
You can check the certificate chain in your browser. E.g. for this site it’s discuss.python.org → R3 → ISRG Root X1 . So if R3 is missing in your CA Store, browsers don’t mind, but Python does.
Are you saying that my script should produce the error for this site? It does not.
import urllib.request
page = urllib.request.urlopen("https://discuss.python.org").read()
print(page)
This works fine.
R3 is NOT in my CA store. ISRG IS, and so is GoDaddy (where I received the cert from).
I find it very hard to believe that a platform as well developed and supported as Python has difficulties decoding SSL. Lots of people claim to have the answer: it’s to import into Python, or into my script, keys that I have to harvest from existing certificates. What is the technical limitation with Python that makes this so difficult?
I ran into this issue recently and found a way to fix it.
It typically indicates a problem with the SSL/TLS certificate validation process when trying to establish a secure connection to a server.
The system may not have the necessary CA certificates to validate the server’s certificate. Ensure that your system’s CA certificate store is up to date and includes the CA that issued the server’s certificate.
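A quick way to check where Python is looking for CA certificates is the stdlib itself (nothing here is specific to the servers in this thread):

```python
import ssl

# Report the default CA locations compiled into this Python's OpenSSL.
paths = ssl.get_default_verify_paths()
print("cafile:", paths.cafile)    # PEM bundle file, if any
print("capath:", paths.capath)    # directory of certs, if any
print("env override:", paths.openssl_cafile_env)  # e.g. SSL_CERT_FILE
```

On Windows, `ssl.create_default_context()` additionally loads the system “CA” and “ROOT” certificate stores via `load_default_certs()`, which is why the contents of the Windows store matter to Python there.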
This link gave me a lot of ideas for fixing my issue: How to Fix Unable to get Local Issuer Certificate - howtouselinux
No, but it’s VERY likely that with something as complicated as SSL, there are things that people can get wrong - and it’s also extremely likely that browsers will permit this and still show the page. Hence advice like checking for the intermediate certificates.
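If the lab server really does omit its intermediate, the proper fix is to configure the server to send the full chain. A client-side workaround is to hand Python the chain explicitly; a sketch, where the bundle filename is hypothetical (GoDaddy publishes its intermediates as a downloadable bundle, but the path below is illustrative, not from this thread):

```python
import os
import ssl
import urllib.request

# Hypothetical path to a downloaded GoDaddy intermediate bundle.
BUNDLE = "gd_bundle-g2-g1.crt"

if os.path.exists(BUNDLE):
    # cafile REPLACES the default store for this context,
    # so the bundle must contain the whole chain you trust.
    ctx = ssl.create_default_context(cafile=BUNDLE)
else:
    # Fall back to the default store when the bundle isn't present.
    ctx = ssl.create_default_context()

def fetch(url: str) -> bytes:
    # Same call as the original script, with the explicit context.
    return urllib.request.urlopen(url, context=ctx).read()
```

If you want to keep the default store and merely add the bundle, use `ctx = ssl.create_default_context()` followed by `ctx.load_verify_locations(BUNDLE)` instead.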