mirror of https://github.com/wyot1/GeoLite2-Unwalled.git synced 2025-04-18 17:44:06 +03:00

amusing badges

This commit is contained in:
wyot1 2023-06-19 11:46:01 +00:00
commit 870154db44
5 changed files with 536 additions and 0 deletions

47
.github/workflows/geolite2.yml vendored Normal file

@@ -0,0 +1,47 @@
name: geolite2
on:
  workflow_dispatch:
  schedule:
    - cron: '0 20 * * *'
jobs:
  update:
    runs-on: ubuntu-22.04
    container:
      image: debian:12
    permissions:
      contents: write
      actions: write
    timeout-minutes: 60
    steps:
      - name: set up some GITHUB_ENV
        run: |
          echo "REPOSITORY_NAME=${GITHUB_REPOSITORY#*/}" >> ${GITHUB_ENV}
          echo "USER_NAME=${{ github.actor }}" >> ${GITHUB_ENV}
          echo "TAG_NAME=$(date +"%Y-%m-%d_%H-%M-%S")" >> ${GITHUB_ENV}
          echo "MM_KEY=${{ secrets.MM_KEY }}" >> ${GITHUB_ENV}
      - name: run script
        shell: bash
        env:
          GH_TOKEN: ${{ github.token }}
          DEBIAN_FRONTEND: noninteractive
        run: |
          echo " # Installing basic deps"
          >/dev/null apt-get -yqq update && >/dev/null apt-get -yqq upgrade
          >/dev/null apt-get -yqq install git curl
          mkdir -p /root/.ssh
          echo " # Adding private ssh key"
          eval "$(ssh-agent -s)"
          echo "${{ secrets.PRIVATE_DEPLOY_KEY }}" > /root/.ssh/PRIVATE_DEPLOY_KEY
          chmod 0400 /root/.ssh/PRIVATE_DEPLOY_KEY ; ssh-add /root/.ssh/PRIVATE_DEPLOY_KEY
          echo " # Scanning github.com to avoid the interactive host-key prompt"
          ssh-keyscan github.com >> /root/.ssh/known_hosts
          echo " # Shallow clone of master"
          git clone "git@github.com:${{ github.repository }}.git" --depth=1 --branch master --single-branch master
          echo " # Cloning the time branch (or initializing it fresh)"
          git clone "git@github.com:${{ github.repository }}.git" --depth=1 --branch time --single-branch time || git init time
          echo " # Running the target script"
          bash ./master/run.sh

127
README.md Normal file

@@ -0,0 +1,127 @@
# GeoLite2-Unwalled
*Fresh GeoLite2 databases, for everyone.*
![badge-build](https://img.shields.io/github/actions/workflow/status/wyot1/GeoLite2-Unwalled/geolite2.yml?labelColor=18122B&logo=data:image/svg+xml;base64,PHN2ZyB4bWxucz0iaHR0cDovL3d3dy53My5vcmcvMjAwMC9zdmciIHZpZXdCb3g9IjUgMS44OCAyMiAyOC4xNSI+PHBhdGggZmlsbD0iI2ZmZiIgZD0iTTEzLjYgMkMxMi41IDIgOSAzIDcuNyA2LjRMNi40IDEwbDEgLjQtLjQuOS0xLjQuNy0uNiAxLjggNC43IDEuNy42LTEuOC0uNS0xLjQuMy0xIDEgLjRMMTIuOCA3cy0uOC0uOC0uMy0yLjJsMS0yLjl6bS4xIDUuNUwxMyA5LjNsMTEuNCA0LjItLjMgMS0xMS40LTQuMi0uNyAxLjlMMjUuMyAxN2wxLjctNC43LTEzLjMtNC44ek05IDE5djFINXYyYTMgMyAwIDAgMCAzIDNoMXYxaDV2MmgtM3YyaDEydi0yaC0zdi0yaDJhNSA1IDAgMCAwIDUtNXYtMkg5em0yIDJoMTRhMyAzIDAgMCAxLTMgM0gxMXYtM3ptLTQgMWgydjFIOGExIDEgMCAwIDEtMS0xem05IDRoMnYyaC0ydi0yeiI+PC9wYXRoPjwvc3ZnPg==) ![badge-time-build](https://img.shields.io/endpoint?cacheSeconds=5&url=https://raw.githubusercontent.com/wyot1/GeoLite2-Unwalled/time/build) ![badge-time-check](https://img.shields.io/endpoint?cacheSeconds=5&url=https://raw.githubusercontent.com/wyot1/GeoLite2-Unwalled/time/check)
[Download](#download) | [Security](#security) | [Support](#support) | [Joining Split Files](#joining-split-files) | [Screenshot of Member Area](#screenshot-of-member-area) | [GitHub Raw Rate Limiting](#github-raw-rate-limiting) | [Legal](#legal)
MaxMind forces a login nowadays, and prohibits VPN/TOR/proxy users from even
signing up. This is basic, monopolized data that should be available to all,
or none.
The files can be distributed legally and are widely used. MaxMind also offers
commercial databases with higher accuracy, but the 'lite' version is good
enough for research and generic use.
Currently, MaxMind's free GeoLite2 includes *CITY*, *COUNTRY*, and
*ASN* databases. **All files available upstream are provided "as is".**
Upstream schedule is *Tue/Fri*, as of today. GitHub's scheduled run times are
unpredictable, so we check *daily*.
## Download
As **GitHub limits file size**, some files may have to be split. The process
handles this automatically if required.
Master lists of current sets are provided for easy processing on your end.
### Master Lists
| | ASN | CITY | COUNTRY |
|-------------------|:-------------------------------------------------------------------------------------------------:|:--------------------------------------------------------------------------------------------------:|:-----------------------------------------------------------------------------------------------------:|
| **GitHub Raw** | | | |
| CSV | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-ASN_CSV.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-CITY_CSV.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-COUNTRY_CSV.lst) |
| CSV-ZST | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-ASN_CSV-ZST.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-CITY_CSV-ZST.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-COUNTRY_CSV-ZST.lst) |
| MMDB | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-ASN_MMDB.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-CITY_MMDB.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-COUNTRY_MMDB.lst) |
| MMDB-ZST | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-ASN_MMDB-ZST.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-CITY_MMDB-ZST.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-COUNTRY_MMDB-ZST.lst) |
| UPSTREAM | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-ASN_UPSTREAM.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-CITY_UPSTREAM.lst) | [URL](https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-COUNTRY_UPSTREAM.lst) |
Plug a list into your `curl`/`wget`/etc. and fetch reliably even if upstream
sizes fluctuate. No need to mess with, or install, `git` just to get one part.
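For instance, a minimal sketch using `wget`'s list mode (pick any master list
from the table above):
```bash
# fetch the master list, then every file it points to
wget -q "https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-CITY_MMDB.lst"
wget -q -i master-CITY_MMDB.lst
```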
### Individual Files
It is up to you to decide how to handle this: if filesizes change, files that
aren't split today may be split in the future.
You can of course re-use the direct links, if reliability is not a concern.
An example:
You download `GeoLite2-City.mmdb` today while it's just below GitHub's limit
(50M soft, 100M hard), say 99M. Next month it's 102M, so it gets split into
`GeoLite2-City.mmdb.00` and `GeoLite2-City.mmdb.01`.
Now your process fails.
*If, instead, you use the master lists, you download the list, join the files
automatically, and don't have to worry.*
[Read on.](#joining-split-files)
If you still want a single link, the ZST version has a higher chance of staying
below the limit.
### Compression
For convenience and to save bandwidth, re-compressed versions are also made
available, using high-level zstandard. It's available almost everywhere and
works well on these files. (`apt install zstd`)
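As a sketch, decompressing one of the ZST files back to its original form
(passing `--long` on decompression is a safe default for files compressed in
long-window mode, as these are):
```bash
zstd -d --long GeoLite2-City.mmdb.zst   # yields GeoLite2-City.mmdb
```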
## Security
My process does not depend on third-party actions, which can lead to dramas
like the Ukraine JS malware thing. You trust only me, GitHub, and MaxMind.
Before considering using such databases for blacklisting, please think twice
about innocent victims such as VPN/TOR/proxy users. Blacklisting countries is
futile anyway, as state-sponsored attackers and similar modern adversaries can
easily buy US/EU retail IPs en masse.
## Support
If this helped you, please take 5 minutes to read **insert support link**.
## Joining Split Files
I shared a guide and a ready-made bash tool here: **insert article link**.
It's simple and reliable, and avoids having to bother with possible filesize
fluctuations.
Until the article is published, here's a minimal example:
```bash
curl -L \
"https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/LIST/GHRAW/master-CITY_MMDB.lst" \
| xargs -n1 curl -LOJ
cat GeoLite2-City.mmdb.?? > GeoLite2-City.mmdb
ls -lh
```
```
-rw-r--r-- 1 user user 69M Jun 12 08:00 GeoLite2-City.mmdb
-rw-r--r-- 1 user user 50M Jun 12 07:58 GeoLite2-City.mmdb.00
-rw-r--r-- 1 user user 19M Jun 12 07:58 GeoLite2-City.mmdb.01
```
Now you can script this to scale in your workflow, handling single files,
split files, and varying part counts. Simple and reliable.
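As a sketch of that scaling, a hypothetical helper (the name `join_or_keep`
is illustrative, not part of this repo) that works whether or not a given
file was split:
```bash
# join name.00, name.01, ... into name, or do nothing if it was never split
join_or_keep() {
  local name="${1}"
  if compgen -G "${name}.[0-9][0-9]" >/dev/null; then
    cat "${name}".[0-9][0-9] > "${name}"
    rm "${name}".[0-9][0-9]
  fi
}
join_or_keep GeoLite2-City.mmdb
```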
## Screenshot of Member Area
To preclude questions, find a screenshot of the member area
[here](./screenshot.png). That's all there is.
## GitHub Raw Rate Limiting
See [this thread][gh-rate-limit-so] for GitHub rate limits. If you use a token,
you can get 5000 req/h regardless of IP. This might be relevant if you're
behind VPN/TOR/proxy/CGNAT. GitHub allows signing up via TOR, where you can get
said token.
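For example, one way to use such a token is GitHub's contents API, which
serves the raw file when asked (a sketch; assumes your token is in `GH_TOKEN`,
adjust path/branch as needed):
```bash
curl -sSL \
  -H "Authorization: Bearer ${GH_TOKEN}" \
  -H "Accept: application/vnd.github.raw" \
  -o master-CITY_MMDB.lst \
  "https://api.github.com/repos/wyot1/GeoLite2-Unwalled/contents/LIST/GHRAW/master-CITY_MMDB.lst?ref=downloads"
```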
## Legal
The code for the process is mine. The databases obviously belong to MaxMind,
and are distributed legally.
[gh-rate-limit-so]: https://stackoverflow.com/questions/66522261/does-github-rate-limit-access-to-public-raw-files

180
process_geolite2.sh Normal file

@@ -0,0 +1,180 @@
#!/usr/bin/env bash
set -euo pipefail
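# split_files DIR...: break any file larger than MAX_SIZE into numbered
# parts (name.00, name.01, ...) and drop the original.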
split_files() {
original_dir="$(pwd)"
for dir in "${@}"; do
cd "${dir}"
while read -r file; do
size=$(stat --printf="%s" "${file}")
if ((size > MAX_SIZE)); then
echo >&2 "File ${file} is too large, splitting"
split -b "${MAX_SIZE}" -d "${file}" "${file}."
rm "${file}"
fi
done < <(find . -mindepth 1 -not -path '*/.*' -type f)
cd "${original_dir}"
done
}
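# curl_return_ename DIR URL: download URL into DIR and print the effective
# (server-supplied) filename.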
curl_return_ename() {
cd "${1}"
curl -sSLOJ \
-w "%{filename_effective}" \
"${2}"
}
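# create_lists KIND DIR...: write a master list of download URLs for every
# file in each DIR to ./LIST/KIND/.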
create_lists() {
for dir in "${@:2}"; do
while read -r path; do
case "${1}" in
"GHRAW")
BASE_URL="https://github.com/wyot1/GeoLite2-Unwalled/raw/downloads/"
;;
"POTENTIAL_CDN")
BASE_URL="https://cdn.unnamed"
;;
esac
echo "${BASE_URL}${path}" >>"./LIST/${1}/master-${dir/\//_}.lst"
done < <(find ./"${dir}" -mindepth 1 -maxdepth 1 -type f | sed 's|^./||' \
| sort)
done
}
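# save_timestamp FILENAME: extract the YYYYMMDD stamp from an upstream
# filename and store it under time/<edition>. ROOT_DIR is exported by run.sh.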
save_timestamp() {
type=$(<<<"${1}" sed -nE 's/^GeoLite2-(.+)_[0-9]{8}\..+$/\1/p')
if grep -wq "${type}" <<<"ASN ASN-CSV City City-CSV Country Country-CSV"; then
new_timestamp=$(<<<"${1}" sed -nE 's/^.+_([0-9]{8})\..+$/\1/p')
echo "${new_timestamp}" > "${ROOT_DIR}/time/${type}"
fi
}
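# GitHub rejects files above 100 MiB, so anything larger gets split.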
MAX_SIZE=$((100 * 1024 ** 2))
ZST_LEVEL=19
TEMPDIR=$(mktemp -d)
SLEEP_MAX=15 # makes correlating this too annoying for most
mkdir -p time; touch -a time/{ASN,ASN-CSV,City,City-CSV,Country,Country-CSV}
mkdir work; cd ./work
mkdir ASN CITY COUNTRY LIST
cd ASN
mkdir CSV CSV-ZST MMDB MMDB-ZST UPSTREAM
>&2 echo "Downloading ASN"
ASN_SRC=(
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-ASN&license_key=${MM_KEY}&suffix=tar.gz"
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-ASN&license_key=${MM_KEY}&suffix=tar.gz.sha256"
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-ASN-CSV&license_key=${MM_KEY}&suffix=zip"
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-ASN-CSV&license_key=${MM_KEY}&suffix=zip.sha256"
)
idx=0
for src in "${ASN_SRC[@]}"; do
sleep "$(shuf -i 0-"${SLEEP_MAX}" -n1)"
filename=$(curl_return_ename "${TEMPDIR}" "${src}")
save_timestamp "${filename}"
mv "${TEMPDIR}/${filename}" "./UPSTREAM/${idx}.${filename}"
idx=$((++idx))
done
cp ./UPSTREAM/*.*ASN_*.tar.gz "${TEMPDIR}"
tar xf "${TEMPDIR}/"*.tar.gz -C "${TEMPDIR}"
find "${TEMPDIR}" -name "*.mmdb" -exec mv "{}" ./MMDB \;
find "${TEMPDIR}" -mindepth 1 -delete
>&2 echo "Compressing MMDB"
find ./MMDB -mindepth 1 -type f -exec zstd -q -T0 -"${ZST_LEVEL}" --long \
--output-dir-flat ./MMDB-ZST "{}" \;
cp ./UPSTREAM/*.*ASN-CSV_*.zip "${TEMPDIR}"
unzip -q "${TEMPDIR}/"*.zip -d "${TEMPDIR}"
find "${TEMPDIR}" -name "*.csv" -exec mv "{}" ./CSV \;
find "${TEMPDIR}" -mindepth 1 -delete
>&2 echo "Compressing CSV"
find ./CSV -mindepth 1 -type f -exec zstd -q -T0 -"${ZST_LEVEL}" --long \
--output-dir-flat ./CSV-ZST "{}" \;
>&2 echo "Looking for splits"
split_files CSV CSV-ZST MMDB MMDB-ZST UPSTREAM
cd ..
cd CITY
mkdir CSV CSV-ZST MMDB MMDB-ZST UPSTREAM
>&2 echo "Downloading CITY"
CITY_SRC=(
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&license_key=${MM_KEY}&suffix=tar.gz"
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&license_key=${MM_KEY}&suffix=tar.gz.sha256"
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City-CSV&license_key=${MM_KEY}&suffix=zip"
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City-CSV&license_key=${MM_KEY}&suffix=zip.sha256"
)
idx=0
for src in "${CITY_SRC[@]}"; do
sleep "$(shuf -i 0-"${SLEEP_MAX}" -n1)"
filename=$(curl_return_ename "${TEMPDIR}" "${src}")
save_timestamp "${filename}"
mv "${TEMPDIR}/${filename}" "./UPSTREAM/${idx}.${filename}"
idx=$((++idx))
done
cp ./UPSTREAM/*.*City_*.tar.gz "${TEMPDIR}"
tar xf "${TEMPDIR}/"*.tar.gz -C "${TEMPDIR}"
find "${TEMPDIR}" -name "*.mmdb" -exec mv "{}" ./MMDB \;
find "${TEMPDIR}" -mindepth 1 -delete
>&2 echo "Compressing MMDB"
find ./MMDB -mindepth 1 -type f -exec zstd -q -T0 -"${ZST_LEVEL}" --long \
--output-dir-flat ./MMDB-ZST "{}" \;
cp ./UPSTREAM/*.*City-CSV_*.zip "${TEMPDIR}"
unzip -q "${TEMPDIR}/"*.zip -d "${TEMPDIR}"
find "${TEMPDIR}" -name "*.csv" -exec mv "{}" ./CSV \;
find "${TEMPDIR}" -mindepth 1 -delete
>&2 echo "Compressing CSV"
find ./CSV -mindepth 1 -type f -exec zstd -q -T0 -"${ZST_LEVEL}" --long \
--output-dir-flat ./CSV-ZST "{}" \;
>&2 echo "Looking for splits"
split_files CSV CSV-ZST MMDB MMDB-ZST UPSTREAM
cd ..
cd COUNTRY
mkdir CSV CSV-ZST MMDB MMDB-ZST UPSTREAM
>&2 echo "Downloading COUNTRY"
COUNTRY_SRC=(
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-Country&license_key=${MM_KEY}&suffix=tar.gz"
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-Country&license_key=${MM_KEY}&suffix=tar.gz.sha256"
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-Country-CSV&license_key=${MM_KEY}&suffix=zip"
"https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-Country-CSV&license_key=${MM_KEY}&suffix=zip.sha256"
)
idx=0
for src in "${COUNTRY_SRC[@]}"; do
sleep "$(shuf -i 0-"${SLEEP_MAX}" -n1)"
filename=$(curl_return_ename "${TEMPDIR}" "${src}")
save_timestamp "${filename}"
mv "${TEMPDIR}/${filename}" "./UPSTREAM/${idx}.${filename}"
idx=$((++idx))
done
cp ./UPSTREAM/*.*Country_*.tar.gz "${TEMPDIR}"
tar xf "${TEMPDIR}/"*.tar.gz -C "${TEMPDIR}"
find "${TEMPDIR}" -name "*.mmdb" -exec mv "{}" ./MMDB \;
find "${TEMPDIR}" -mindepth 1 -delete
>&2 echo "Compressing MMDB"
find ./MMDB -mindepth 1 -type f -exec zstd -q -T0 -"${ZST_LEVEL}" --long \
--output-dir-flat ./MMDB-ZST "{}" \;
cp ./UPSTREAM/*.*Country-CSV_*.zip "${TEMPDIR}"
unzip -q "${TEMPDIR}/"*.zip -d "${TEMPDIR}"
find "${TEMPDIR}" -name "*.csv" -exec mv "{}" ./CSV \;
find "${TEMPDIR}" -mindepth 1 -delete
>&2 echo "Compressing CSV"
find ./CSV -mindepth 1 -type f -exec zstd -q -T0 -"${ZST_LEVEL}" --long \
--output-dir-flat ./CSV-ZST "{}" \;
>&2 echo "Looking for splits"
split_files CSV CSV-ZST MMDB MMDB-ZST UPSTREAM
cd ..
cd LIST
mkdir GHRAW JSDELIVR
cd ..
create_lists GHRAW ASN/CSV ASN/CSV-ZST ASN/MMDB \
ASN/MMDB-ZST ASN/UPSTREAM
create_lists GHRAW CITY/CSV CITY/CSV-ZST CITY/MMDB \
CITY/MMDB-ZST CITY/UPSTREAM
create_lists GHRAW COUNTRY/CSV COUNTRY/CSV-ZST COUNTRY/MMDB \
COUNTRY/MMDB-ZST COUNTRY/UPSTREAM
>&2 echo "Deleting tempdir"
rm -rf "${TEMPDIR}"
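# save_timestamp_badge is defined and exported by run.sh, which invokes this script.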
save_timestamp_badge build build
>&2 echo "Finished processing mm files"

182
run.sh Normal file

@@ -0,0 +1,182 @@
#!/usr/bin/env bash
set -euo pipefail
# These are necessary as gh quite often fails inexplicably.
failsafe() {
if ((${1} >= MAX_ATTEMPTS)); then
FAILSAFE_NOTE="Github failed even after ${MAX_ATTEMPTS} attempts incl. "
FAILSAFE_NOTE+="waits. Exiting to avoid tickling the rate limiter."
>&2 echo "${FAILSAFE_NOTE}"
exit 1
fi
}
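# save_timestamp_badge LABEL FILE: write a shields.io endpoint-badge JSON
# blob with the current UTC time to time/FILE.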
save_timestamp_badge() {
timestamp_badge_json='{ "schemaVersion": 1, "label": "'"${1}"'", "message": "'"$(date -u +"%a %b %d %Y %H:%M %z")"'", "labelColor": "18122B", "color": "107CD2", "logoSvg": "<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"4 4 24 24\"><path fill=\"#fff\" d=\"M16 4a12 12 0 1 0 0 24 12 12 0 0 0 0-24Zm0 2a10 10 0 1 1 0 20 10 10 0 0 1 0-20Zm-1 2v9h7v-2h-5V8Z\"\/><\/svg>"}'
echo "${timestamp_badge_json}" > "${ROOT_DIR}/time/${2}"
}
export -f save_timestamp_badge
>&2 echo " # Basic env info"
>&2 echo "uname: $(uname -a)"
>&2 df -h .
>&2 grep -c processor /proc/cpuinfo
>&2 grep -E 'model name' -m 1 /proc/cpuinfo
>&2 grep 'MemAvailable' /proc/meminfo
export ROOT_DIR="$(pwd)"
>&2 echo " # Testing remote for update"
mkdir -p "${ROOT_DIR}/time"
touch -a time/{ASN,ASN-CSV,City,City-CSV,Country,Country-CSV}
save_timestamp_badge check check
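# HEAD each edition and compare the timestamp embedded in the
# content-disposition filename against the stored one.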
declare -A TEST_SET
TEST_SET=(
[ASN]="https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-ASN&license_key=${MM_KEY}&suffix=tar.gz"
[ASN-CSV]="https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-ASN-CSV&license_key=${MM_KEY}&suffix=zip"
[City]="https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City&license_key=${MM_KEY}&suffix=tar.gz"
[City-CSV]="https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-City-CSV&license_key=${MM_KEY}&suffix=zip"
[Country]="https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-Country&license_key=${MM_KEY}&suffix=tar.gz"
[Country-CSV]="https://download.maxmind.com/app/geoip_download?edition_id=GeoLite2-Country-CSV&license_key=${MM_KEY}&suffix=zip"
)
current=1
for type in "${!TEST_SET[@]}"; do
sleep "$(shuf -i 0-15 -n1)"
content_disposition=$(curl -sSLI -o/dev/null -w '%header{content-disposition}' "${TEST_SET[${type}]}")
content_disposition="${content_disposition##attachment; filename=}"
new_timestamp=$(<<<"${content_disposition}" sed -nE 's/^.+_([0-9]{8})\..+$/\1/p')
cur_timestamp=$(<"time/${type}")
if [[ -z "${new_timestamp}" ]]; then
>&2 echo " # new timestamp is empty, something's up"
exit 1
fi
if (( new_timestamp > cur_timestamp )); then
current=0
break
fi
done
if (( current==0 )); then
>&2 echo " # remote is newer, allowing update"
else
>&2 echo " # local still seems to be current, skipping build"
fi
if (( current==0 )); then
>&2 echo " # Installing deps"
curl -sSL https://cli.github.com/packages/githubcli-archive-keyring.gpg \
-o /usr/share/keyrings/githubcli-archive-keyring.gpg
chmod go+r /usr/share/keyrings/githubcli-archive-keyring.gpg
GH_CLI_REPO="deb [arch=$(dpkg --print-architecture) "
GH_CLI_REPO+="signed-by=/usr/share/keyrings/githubcli-archive-keyring.gpg] "
GH_CLI_REPO+="https://cli.github.com/packages stable main"
echo "${GH_CLI_REPO}" >>/etc/apt/sources.list.d/github-cli.list
>/dev/null apt-get -yqq update
>/dev/null apt-get -yqq install curl gh unzip zstd
tmp=" # Processing the files. This includes random delays to make correlation \
too annoying."
>&2 echo "${tmp}"
bash master/process_geolite2.sh
fi
>&2 echo " # Preparing git"
git config --global init.defaultBranch master
git config --global user.name "${USER_NAME}"
git config --global user.email "${USER_NAME}@users.noreply.github.com"
MAX_ATTEMPTS=10
FS_SLEEP=5
if (( current==0 )); then
>&2 echo " # Pushing downloads to git"
cd work
git init
git checkout -b downloads
git add .
git commit -m "${TAG_NAME}"
git remote add origin "git@github.com:${USER_NAME}/${REPOSITORY_NAME}.git"
# ssh is apparently less buggy than github's https
# git remote add origin "https://${USER_NAME}:${GH_TOKEN}@github.com/\
# ${USER_NAME}/${REPOSITORY_NAME}"
rc=1 ; attempts=0
while ((rc != 0)); do
((++attempts))
git push -f origin downloads \
&& rc=${?} || rc=${?} && sleep "${FS_SLEEP}"
>&2 echo "git push rc: ${rc}"
failsafe "${attempts}"
done
cd ..
fi
>&2 echo " # Pushing time to git"
cd time
git remote add origin "git@github.com:${USER_NAME}/${REPOSITORY_NAME}.git" \
|| true
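# Rebuild 'time' as a single fresh orphan commit so the branch history never grows.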
git checkout --orphan foo
git add -A
git commit -am "${TAG_NAME}"
git branch -D time || true
git branch -m time
rc=1 ; attempts=0
while ((rc != 0)); do
((++attempts))
git push -f origin time \
&& rc=${?} || rc=${?} && sleep "${FS_SLEEP}"
>&2 echo "git push rc: ${rc}"
failsafe "${attempts}"
done
cd ..
if (( current==0 )); then
>&2 echo " # Pruning"
cd work
rc=1 ; attempts=0
while ((rc != 0)); do
((++attempts))
git remote prune origin \
&& rc=${?} || rc=${?} && sleep "${FS_SLEEP}"
>&2 echo "git remote prune rc: ${rc}"
failsafe "${attempts}"
done
cd ..
>&2 echo " # Creating release"
cd work
ls -lah
RELEASE_NOTE="Updated files.
Releases serve as update indicators - see the README for the master lists, and \
an explanation. The \`downloads\` branch holds the current stack."
rc=1 ; attempts=0
while ((rc != 0)); do
((++attempts))
gh release create -n "${RELEASE_NOTE}" "${TAG_NAME}" ../master/README.md \
&& rc=${?} || rc=${?} && sleep "${FS_SLEEP}"
>&2 echo "gh release create rc: ${rc}"
failsafe "${attempts}"
done
>&2 echo " # Deleting previous releases"
rc=1 ; attempts=0
while ((rc != 0)); do
((++attempts))
gh release list -L 999999999 | awk '{ print $1 }' | tail -n +13 \
| xargs -I{} gh release delete --cleanup-tag -y "{}" \
&& rc=${?} || rc=${?} && sleep "${FS_SLEEP}"
>&2 echo "gh release delete rc: ${rc}"
failsafe "${attempts}"
done
>&2 echo " # Deleting previous workflows"
CLEAN_TIMESTAMP=$(date -u --date="-30 day" "+%Y-%m-%d")
rc=1 ; attempts=0
while ((rc != 0)); do
((++attempts))
gh run list -R "${USER_NAME}/${REPOSITORY_NAME}" -L 999999999 \
--json databaseId --created "<${CLEAN_TIMESTAMP}" -q '.[].databaseId' \
| xargs -I{} gh run delete -R "${USER_NAME}/${REPOSITORY_NAME}" {} \
&& rc=${?} || rc=${?} && sleep "${FS_SLEEP}"
>&2 echo "gh workflow delete rc: ${rc}"
failsafe "${attempts}"
done
fi

BIN
screenshot.png Normal file

Binary file not shown.

Size: 397 KiB