There seems to be an account/storage mapping issue on the GroupDocs backend when using Python. I had similar problems testing with Aspose (and finally gave up). I can authenticate, but I always get a 401 for uploads. I’ve tried “Default”, the storage GUID, and the actual storage name, but nothing works. My app is listed in the dashboard and is enabled. I always get:
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url:
Hi @brchandl, can you please confirm the issue still persists? Can you please share the code fragment that causes the error?
Hi @sergei.terentev - Thanks for your response. Here’s a snippet of code below. As I mentioned, I’ve tried “Default”, the GUID storage name, and the name I gave to the storage (and I’m using the GroupDocs internal storage to simplify it rather than my Amazon S3 bucket), yet nothing works. I had almost exactly the same issue with Aspose a couple of weeks ago, including trying to tie in my Amazon S3 bucket storage, and after troubleshooting for several hours, I finally gave up.
…
import os
import requests

# --- ENV VARS ---
GROUPDOCS_STORAGE_NAME = os.getenv("GROUPDOCS_STORAGE_NAME")
GROUPDOCS_ID = os.getenv("GROUPDOCS_ID")
GROUPDOCS_SECRET = os.getenv("GROUPDOCS_SECRET")
if not (GROUPDOCS_STORAGE_NAME and GROUPDOCS_ID and GROUPDOCS_SECRET):
    raise RuntimeError("Missing GroupDocs Cloud credentials in environment variables.")

API_BASE = "https://api.groupdocs.cloud/v1.0"
TOKEN_URL = "https://api.groupdocs.cloud/connect/token"

def get_token():
    # Request an OAuth2 token via the client credentials flow
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": GROUPDOCS_ID,
            "client_secret": GROUPDOCS_SECRET,
        },
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

token = get_token()
headers = {"Authorization": f"Bearer {token}"}

# Upload the DOCX to cloud storage: this is the call that returns 401
filename = os.path.basename(file_path)
upload_url = f"{API_BASE}/words/storage/file/{filename}"
with open(file_path, "rb") as f:
    resp = requests.put(upload_url, data=f, headers=headers)
resp.raise_for_status()

# Extract the text items from the uploaded document
get_text_url = f"{API_BASE}/words/{filename}/textItems?folder="
resp = requests.get(get_text_url, headers=headers)
resp.raise_for_status()
items = resp.json().get("textItems", {}).get("list", [])
if not items:
    raise RuntimeError("No extractable text found in DOCX.")
original_texts = [item["text"] for item in items]

…

# Replace the original text with the translated text
replace_url = f"{API_BASE}/words/{filename}/replaceText?folder="
replace_payload = {"replaceTextRequestList": replacements}
resp = requests.post(replace_url, headers={**headers, "Content-Type": "application/json"}, json=replace_payload)
resp.raise_for_status()

# Download the resulting file from storage
translated_filename = filename
out_url = f"{API_BASE}/words/storage/file/{translated_filename}"
resp = requests.get(out_url, headers=headers, stream=True)
resp.raise_for_status()

out_path = file_path + ".translated.docx"
with open(out_path, "wb") as out_f:
    for chunk in resp.iter_content(chunk_size=8192):
        if chunk:
            out_f.write(chunk)

return out_path
Hi @brchandl, looking at your response and source code, I have some questions:
- Why don’t you use the GroupDocs.Conversion Cloud SDK for Python, or the Python SDKs for the other GroupDocs and Aspose Cloud APIs? You may check the examples of using it (a rough sketch of the upload is also included at the end of this post):
GitHub - groupdocs-conversion-cloud/groupdocs-conversion-cloud-python-samples: GroupDocs.Conversion Cloud SDK for Python examples, plugins and showcase projects
Right here is code that uploads files into storage:
groupdocs-conversion-cloud-python-samples/Examples/Common.py at master · groupdocs-conversion-cloud/groupdocs-conversion-cloud-python-samples · GitHub
- I see API_BASE = "https://api.groupdocs.cloud/v1.0" and upload_url = f"{API_BASE}/words/storage/file/{filename}", but GroupDocs.Conversion Cloud has different storage URLs. The API base URL should be "https://api.groupdocs.cloud", and the file upload URL, for example, is {API_BASE}/v2.0/conversion/storage/file/{path}. The API version for GroupDocs.Conversion is v2.0 (see the corrected snippet below).
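For example, the upload from your snippet would look roughly like this against the GroupDocs.Conversion storage endpoint. This is just a sketch that reuses your existing token/headers variables; the storageName query parameter is optional and can be omitted for the default storage:

# Sketch: same requests-based upload, pointed at the GroupDocs.Conversion storage API
API_BASE = "https://api.groupdocs.cloud"

upload_url = f"{API_BASE}/v2.0/conversion/storage/file/{filename}"
with open(file_path, "rb") as f:
    resp = requests.put(
        upload_url,
        data=f,
        headers=headers,  # same Bearer token as in your get_token() code
        params={"storageName": GROUPDOCS_STORAGE_NAME},  # omit to use the default storage
    )
resp.raise_for_status()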
Here is API reference for all methods in GroupDocs Conversion Cloud:
GroupDocs.Conversion for Cloud - API References
Also, you can find some useful code examples in the Documentation.
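And here is a rough sketch of the upload using the GroupDocs.Conversion Cloud SDK for Python, following the pattern from the Common.py sample linked above. The SDK takes care of the token request for you; the credentials and file paths below are placeholders:

import groupdocs_conversion_cloud

client_id = "YOUR_CLIENT_ID"          # from the dashboard
client_secret = "YOUR_CLIENT_SECRET"
storage_name = "YOUR_STORAGE_NAME"    # as shown in the dashboard

# The SDK obtains and refreshes the access token internally
file_api = groupdocs_conversion_cloud.FileApi.from_keys(client_id, client_secret)

# Upload a local DOCX into cloud storage
request = groupdocs_conversion_cloud.UploadFileRequest(
    "documents/sample.docx",    # destination path in storage
    "/local/path/sample.docx",  # local file to upload
    storage_name,
)
result = file_api.upload_file(request)
print(result.uploaded, result.errors)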