Running a Python script manually every time it is needed works at first. It stops working when the script needs to run daily, when you are not at the machine, or when the output needs to be available somewhere other than a local folder.
GitHub Actions solves this without requiring a server, a cron job on a remote machine, or any DevOps infrastructure. A workflow file in the repository defines when and how the script runs. Everything else is handled by GitHub.
This guide covers setting up a Python script for automated execution, creating a workflow that triggers on push or a schedule, saving output files as downloadable artefacts, and optionally routing results to cloud storage or email.
What this covers:
Structuring a Python script and repository for GitHub Actions
Creating a workflow that triggers on push and on a schedule
Installing dependencies and running the script in the workflow
Saving output files as build artefacts
Optional: uploading results to Google Drive or sending via email
What Gets Built
By the end of this guide, a GitHub repository will:
Run a Python script automatically on every push to main
Run the same script on a daily schedule using a cron expression
Install any required dependencies before each run
Save output files (CSVs, logs, reports) as downloadable artefacts from the Actions tab
Practical use cases this covers: daily web scraping, scheduled report generation, data cleaning pipelines, and any recurring task currently run by hand.
Step 1: Prepare the Python Script
The script needs to run cleanly from the command line with no interactive input. A minimal example that writes output to a file:
# main.py
import datetime

now = datetime.datetime.now()
print(f"Script executed at: {now}")

with open("output.txt", "w") as f:
    f.write(f"Last run: {now}\n")
    f.write("This file was generated by GitHub Actions.")
If the script has dependencies, list them in requirements.txt:
requests==2.31.0
pandas==2.1.0
Pinning versions in requirements.txt ensures the workflow installs exactly the same library versions every run, which prevents dependency updates from silently breaking the script.
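As a quick illustration of what pinning means, the sketch below (a hypothetical helper, not part of the workflow) parses name==version lines in the format a pinned requirements.txt uses:

```python
# Illustrative sketch: parse pinned "name==version" lines from a
# requirements.txt so the pins can be inspected programmatically.

def parse_pins(lines):
    """Return a {package: version} dict from pinned requirement lines."""
    pins = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        name, _, version = line.partition("==")
        pins[name] = version
    return pins

print(parse_pins(["requests==2.31.0", "pandas==2.1.0"]))
# {'requests': '2.31.0', 'pandas': '2.1.0'}
```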
Step 2: Push the Repository to GitHub
The repository structure before adding the workflow:
your-repo/
├── main.py
├── requirements.txt
└── .github/
    └── workflows/
Create the .github/workflows/ directory now. The workflow file added in the next step goes there.
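The directory can be created by hand, through the GitHub web editor, or with a one-liner; for example, in Python from the repository root:

```python
from pathlib import Path

# Create .github/workflows/ (and .github/ if it does not exist yet).
Path(".github/workflows").mkdir(parents=True, exist_ok=True)
```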
Step 3: Create the Workflow File
Create .github/workflows/run-python.yml with the following content:
name: Run Python Script

on:
  push:
    branches:
      - main
  schedule:
    - cron: '0 8 * * *'  # Runs every day at 8:00 AM UTC

jobs:
  run-script:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v4

      - name: Set Up Python
        uses: actions/setup-python@v5
        with:
          python-version: '3.11'

      - name: Install Dependencies
        run: |
          pip install --upgrade pip
          pip install -r requirements.txt

      - name: Run Python Script
        run: python main.py

      - name: Upload Output File
        uses: actions/upload-artifact@v4
        with:
          name: script-output
          path: output.txt
What this workflow does, step by step:
Triggers on every push to main and once daily at 08:00 UTC
Checks out the repository code onto the runner
Installs Python 3.11
Upgrades pip and installs the dependencies from requirements.txt
Runs main.py
Uploads output.txt as a named artefact available for download
The cron expression '0 8 * * *' means: minute 0, hour 8, every day of the month, every month, every day of the week. Adjusting the hour and minute values changes when it runs. GitHub uses UTC, so account for the offset from your local timezone.
Note: GitHub Actions does not guarantee exact cron timing under high load. Scheduled runs may be delayed by several minutes and, in periods of very high load, can occasionally be dropped entirely.
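Since the cron expression is evaluated in UTC, converting a desired local run time is a common first step. A small sketch, assuming a UTC+5:30 local timezone purely for illustration:

```python
from datetime import datetime, timedelta, timezone

# Assumed example offset: UTC+5:30. Swap in your own local offset.
local_tz = timezone(timedelta(hours=5, minutes=30))

# Desired local run time: 08:00.
local_run = datetime(2024, 1, 1, 8, 0, tzinfo=local_tz)
utc_run = local_run.astimezone(timezone.utc)

# Emit the minute and hour fields for the cron expression.
print(utc_run.strftime("%M %H") + " * * *")
# 30 02 * * *
```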
Step 4: View and Download Artefacts
After the workflow runs:
Open the repository on GitHub and click the Actions tab
Select the workflow run from the list
Scroll to the Artefacts section at the bottom of the run summary
Click the artefact name to download it
GitHub retains artefacts for 90 days by default. For output that needs to persist longer or be accessible outside of GitHub, the optional steps below cover routing results to cloud storage or email.
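The retention window can also be set per artefact with the retention-days input of upload-artifact (typically between 1 and 90 days, depending on repository settings):

```yaml
- name: Upload Output File
  uses: actions/upload-artifact@v4
  with:
    name: script-output
    path: output.txt
    retention-days: 30  # keep this artefact for 30 days instead of the default
```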
Optional: Extend with Cloud Storage or Email
Upload Output to Google Drive
- name: Upload to Google Drive
  uses: itrsgroup/drive-upload-action@v1
  with:
    client_id: ${{ secrets.DRIVE_CLIENT_ID }}
    client_secret: ${{ secrets.DRIVE_CLIENT_SECRET }}
    refresh_token: ${{ secrets.DRIVE_REFRESH_TOKEN }}
    file_path: output.txt
Store the OAuth credentials in GitHub Secrets under repository Settings. The DRIVE_REFRESH_TOKEN requires completing a Google OAuth flow once to obtain the refresh token, which the action then uses for all subsequent uploads.
Send Output via Email
- name: Send Email
  uses: dawidd6/action-send-mail@v3
  with:
    server_address: smtp.gmail.com
    server_port: 465
    username: ${{ secrets.EMAIL_USER }}
    password: ${{ secrets.EMAIL_PASSWORD }}
    from: ${{ secrets.EMAIL_USER }}      # the action requires a sender address
    to: ${{ secrets.EMAIL_RECIPIENT }}   # and a recipient; store this as a secret too
    subject: Python Script Output
    body: Attached is today's output.
    attachments: output.txt
For Gmail, use an App Password rather than the account password. Store all of these values in GitHub Secrets.
Common Issues and Fixes
Script runs locally but fails in the workflow. The most common cause is a dependency missing from requirements.txt. Add pip list as a step before the run step to log what is installed, then compare against what the script imports.
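A debug step of that kind, placed before the Run Python Script step, could look like:

```yaml
- name: List Installed Packages
  run: pip list
```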
The scheduled run is not triggering. GitHub pauses scheduled workflows on repositories with no activity for 60 days. Push a commit or manually trigger the workflow from the Actions tab to reactivate it.
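Manual triggering from the Actions tab requires a workflow_dispatch trigger in the on: block; adding it alongside the existing triggers enables the Run workflow button:

```yaml
on:
  workflow_dispatch:  # adds a manual "Run workflow" button in the Actions tab
  push:
    branches:
      - main
  schedule:
    - cron: '0 8 * * *'
```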
The artefact upload step fails with "no files found". The path in upload-artifact must match exactly where the script writes its output. Print the working directory in the script with print(os.getcwd()) and verify the output file path is correct.
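A minimal sketch of that kind of debug output, added temporarily to the script:

```python
import os

# Temporary debug lines: show where the script is running and where
# the output file lands relative to the runner's working directory.
print("working directory:", os.getcwd())
print("output path:", os.path.abspath("output.txt"))
print("output exists:", os.path.exists("output.txt"))
```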
Key Takeaways
GitHub Actions runs Python scripts automatically on push or a cron schedule without requiring a server.
Pinning dependency versions in requirements.txt ensures consistent behavior across every workflow run.
Artefacts store output files for 90 days and are downloadable directly from the Actions tab.
Google Drive uploads and email delivery are available through third-party actions, with credentials stored in GitHub Secrets.
Scheduled workflows on inactive repositories are paused after 60 days and need to be manually reactivated.
Conclusion
Any Python script that runs manually on a schedule is a candidate for automation with GitHub Actions. The setup is low overhead: a repository, a workflow file, and the existing script. The result is a process that runs reliably whether or not anyone is at the machine, with output stored and accessible without any manual steps.
The workflow structure covered here scales to more complex scripts without changing the approach. Additional steps, conditional logic, matrix builds across Python versions, and integration with other tools all follow the same YAML-based pattern established in this guide.
Automating a specific Python workflow and running into a problem? Describe it in the comments.