The `gnupg` package from Homebrew only installs a `gpg` binary, not a `gpg2` binary. I had previously worked around this by manually creating an alias, but I think we can do better.
GPG version 1 is ancient; it is the legacy branch and [has only seen occasional maintenance releases for years](https://gnupg.org/download/release_notes.html). Additionally, `gpg` has referred to GPG 2 in Ubuntu since at least 20.04, which is the oldest non-EOL'd release as of this writing, so I think this change is safe to make.
Implement an Authenticator which can fulfill a dns-01 challenge using the OVH DNS API. Applicable only for domains using OVH DNS.
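As a rough sketch of the shape such a plugin takes on top of certbot's `dns_common.DNSAuthenticator` base class (the `_OVHClient` helper below is a hypothetical placeholder for the code that talks to the OVH API, and the details are illustrative rather than the plugin's actual implementation; the option names simply mirror the credentials exercised in the testing notes below):

```python
# Illustrative sketch only; _OVHClient is a hypothetical stand-in for the code
# that calls the OVH API.
from certbot.plugins import dns_common


class _OVHClient(object):
    """Hypothetical thin wrapper around the OVH DNS API."""

    def __init__(self, application_key, application_secret, consumer_key):
        self._creds = (application_key, application_secret, consumer_key)

    def add_txt_record(self, domain, record_name, record_content):
        pass  # create the TXT record via the OVH API

    def del_txt_record(self, domain, record_name, record_content):
        pass  # remove the TXT record via the OVH API


class Authenticator(dns_common.DNSAuthenticator):
    """DNS Authenticator sketch for OVH DNS."""

    description = 'Obtain certificates using a DNS TXT record (OVH DNS).'

    @classmethod
    def add_parser_arguments(cls, add):
        super(Authenticator, cls).add_parser_arguments(add, default_propagation_seconds=30)
        add('credentials', help='OVH credentials INI file.')

    def more_info(self):
        return ('Solves a dns-01 challenge by creating, and later removing, '
                'TXT records using the OVH API.')

    def _setup_credentials(self):
        self.credentials = self._configure_credentials(
            'credentials', 'OVH credentials INI file',
            {'application-key': 'application key for the OVH API',
             'application-secret': 'application secret for the OVH API',
             'consumer-key': 'consumer key for the OVH API'})

    def _get_client(self):
        return _OVHClient(self.credentials.conf('application-key'),
                          self.credentials.conf('application-secret'),
                          self.credentials.conf('consumer-key'))

    def _perform(self, domain, validation_name, validation):
        self._get_client().add_txt_record(domain, validation_name, validation)

    def _cleanup(self, domain, validation_name, validation):
        self._get_client().del_txt_record(domain, validation_name, validation)
```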
Testing Done:
* `tox -e py27`
* `tox -e lint`
* Manual testing:
* Used `certbot certonly --dns-ovh -d`, specifying a credentials file as a command line argument. Verified that a certificate was successfully obtained without user interaction.
* Used `certbot certonly --dns-ovh -d`, without specifying a credentials file as a command line argument. Verified that the user was prompted and that a certificate was successfully obtained.
* Used `certbot certonly -d`. Verified that the user was prompted for a credentials file after selecting the OVH plugin interactively and that a certificate was successfully obtained.
* Used `certbot renew --force-renewal`. Verified that certificates
were renewed without user interaction.
* Negative testing:
* Path to non-existent credentials file.
* Credentials file with unsafe permissions (644).
* Path to credentials file with an invalid application key.
* Path to credentials file with an invalid application secret.
* Path to credentials file with an invalid consumer key.
* Path to credentials file with missing properties.
* Domain name not registered to OVH account.
Implement an Authenticator which can fulfill a dns-01 challenge using
the Gehirn DNS (Gehirn Infrastructure Service) API.
Applicable only for domains using Gehirn DNS.
Testing Done:
* `tox -e py27`
* `tox -e lint`
* Manual testing:
* Used `certbot certonly --dns-gehirn -d`, specifying a
credentials file as a command line argument. Verified that a
certificate was successfully obtained without user interaction.
* Negative testing:
* Path to non-existent credentials file.
* Credentials file with unsafe permissions (644).
* Domain name not registered to Gehirn DNS account.
Implement an Authenticator which can fulfill a dns-01 challenge using
the Sakura Cloud DNS API.
Applicable only for domains using Sakura Cloud for DNS.
Testing Done:
* `tox -e py27`
* `tox -e lint`
* Manual testing:
* Used `certbot certonly --dns-sakuracloud -d`, specifying a
credentials file as a command line argument. Verified that a
certificate was successfully obtained without user interaction.
* Negative testing:
* Path to non-existent credentials file.
* Credentials file with unsafe permissions (644).
* Domain name not registered to Sakura Cloud account.
* Added DNS-based authenticator plugin for Linode
* Added linode plugin to docs
* Added Dockerfile
* Added .gitignore and readthedocs.org.requirements.txt
* Updated default_propagation_seconds
* Updated according to changes requested
* Bump version to 0.26.0
* Advertise our packages work on Python 3.7.
This allows us to depend on packages like `acme>=0.26.0.dev0` during development
and automatically change that to `acme>=0.26.0` during the release. We use `git add
-p` to be safe, but if `.dev0` is used at all in our released setup.py files,
we're probably doing something wrong.
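For illustration, a development-time setup.py for a subpackage might look like this (the package name and pins below are examples, not the contents of any actual file):

```python
# Hypothetical excerpt from a subpackage's setup.py during development.
# The release process rewrites the .dev0 pins to the plain released version.
from setuptools import setup

setup(
    name='certbot-dns-example',    # illustrative package name
    version='0.26.0.dev0',         # development version
    install_requires=[
        'acme>=0.26.0.dev0',       # becomes 'acme>=0.26.0' at release
        'certbot>=0.26.0.dev0',    # becomes 'certbot>=0.26.0' at release
    ],
)
```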
* Use pipstrap to install a good version of pip
* Use pytest in certbot-auto tests
* Remove nose usage in auto_test.py
* remove nose dev dep
* use pytest in test_tests
* Use pytest in tox
* Update dev dependency pinnings
* remove nose multiprocess lines
* Use pytest for coverage
* Use older py and pytest for old python versions
* Add test for Error.__str__ (see the illustrative sketch after this list)
* pin pytest in oldest test
* Fix tests for DNS-DO plugin on py26
* Work around bug for Python 3.3
* Clarify dockerfile comments
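As a purely illustrative example of the pytest style these changes move toward (the `Error` class below is a stand-in, not certbot's actual exception type):

```python
# Illustrative only: plain pytest-style tests with no nose-specific helpers.
# Error is a stand-in for a package exception with a custom __str__.
import pytest


class Error(Exception):
    """Stand-in error type with a custom string representation."""
    def __str__(self):
        return 'error: {0}'.format(self.args[0])


def test_error_str():
    assert str(Error('boom')) == 'error: boom'


def test_error_is_raised():
    with pytest.raises(Error, match='boom'):
        raise Error('boom')
```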
Introduce a plugin that automates the process of completing a dns-01 challenge by creating, and subsequently removing, TXT records using RFC 2136 Dynamic Updates (a.k.a. nsupdate).
This plugin has been tested with BIND, but may work with other RFC 2136-compatible DNS servers, such as PowerDNS.
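For reference, the kind of RFC 2136 dynamic update involved can be expressed with dnspython roughly as follows (the server address, key name, and key secret are placeholders, and this is an illustration rather than the plugin's own code):

```python
# Illustrative only: create and then remove the dns-01 TXT record via an
# RFC 2136 dynamic update signed with TSIG (placeholder server and key values).
import dns.query
import dns.tsigkeyring
import dns.update

keyring = dns.tsigkeyring.from_text(
    {'keyname.': 'bWVyZWx5IGFuIGV4YW1wbGUga2V5IHNlY3JldA=='})

update = dns.update.Update('example.com', keyring=keyring, keyalgorithm='hmac-sha512')
update.add('_acme-challenge', 300, 'TXT', 'validation-token-from-the-acme-server')
dns.query.tcp(update, '192.0.2.1')   # send the add to the authoritative server

cleanup = dns.update.Update('example.com', keyring=keyring, keyalgorithm='hmac-sha512')
cleanup.delete('_acme-challenge', 'TXT')
dns.query.tcp(cleanup, '192.0.2.1')  # remove the record once validation is done
```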
Implement an Authenticator which can fulfill a dns-01 challenge using
the LuaDNS API. Applicable only for domains using LuaDNS for DNS.
Testing Done:
* `tox -e py27`
* `tox -e lint`
* Manual testing:
* Used `certbot certonly --dns-luadns -d`, specifying a
credentials file as a command line argument. Verified that a
certificate was successfully obtained without user interaction.
* Negative testing:
* Path to non-existent credentials file.
* Credentials file with unsafe permissions (644).
* Path to credentials file without an email.
* Path to credentials file with an invalid email.
* Path to credentials file without a token.
* Path to credentials file with an invalid token.
* Domain name not registered to LuaDNS account.
This change refactors the release script to handle subpackages that are
not bundled as part of certbot-auto.
The script now lets developers mark each subpackage as either included
in certbot-auto or not.
The script then uses one of three sets of subpackages for each operation:
* The version number is updated for all non-certbot subpackages
(and certbot itself is handled separately)
* sdists and wheels are created for all non-certbot subpackages
(and certbot itself is handled separately)
* Testing is performed for all subpackages
* Hashes are pinned for certbot-auto subpackages (including certbot)
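Expressed as a Python sketch (the names and groupings below are illustrative, not the release script's actual variables or full package lists), the three sets work out to:

```python
# Illustrative sketch of the grouping described above.
CERTBOT_AUTO_SUBPACKAGES = ['acme', 'certbot-apache', 'certbot-nginx']  # bundled in certbot-auto
OTHER_SUBPACKAGES = ['certbot-dns-ovh', 'certbot-dns-gehirn']           # not bundled

# Version bumps, sdists, and wheels: every subpackage except certbot itself,
# which is handled separately.
build_targets = CERTBOT_AUTO_SUBPACKAGES + OTHER_SUBPACKAGES

# Testing: every subpackage, certbot included.
test_targets = ['certbot'] + build_targets

# Hash pinning: only what certbot-auto installs, certbot included.
pin_targets = ['certbot'] + CERTBOT_AUTO_SUBPACKAGES
```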
* Revert "Pin python-augeas version to avoid error with 1.0.0 (#4422)"
This reverts commit 1c51ae2588.
* make dependency-requirements
* separate certbot and dependency requirements
* fix build.py
* update hashin comment
* simplify release pinning
* separate letsencrypt dependency
* pin hashes in venv
* error out when bad things happen
* use pinned dependencies in tox
* Revert "pin hashes in venv"
This reverts commit 1cd38a9e50.
* use pip_install.sh in venv_common
* quote pip install args
* bump mock version
doesn't work if you don't have `pip` installed (like me) and I think using
`pip` from the venv should be preferred to ensure you are using the latest
`pip` (which was updated in the venv earlier in the script).
letsencrypt-auto-requirements.txt that will change with every release. This
change strips the hashes of the previous packages before adding the new ones.
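A rough sketch of that idea (illustrative only, with a made-up function name; not the actual release tooling):

```python
# Rough illustration only: remove the existing hash-pinned entry for a package,
# including its continued --hash lines, before the fresh pin is appended.
def strip_old_pin(requirement_lines, package):
    kept, in_old_entry = [], False
    for line in requirement_lines:
        if line.startswith(package + '=='):
            in_old_entry = True          # start of the stale entry; drop it
        elif in_old_entry and line.lstrip().startswith('--hash='):
            pass                         # continuation hash lines; drop them too
        else:
            in_old_entry = False
            kept.append(line)
    return kept
```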
It will always be a copy of the latest release version, 0.4 in this case. (Modify the release script to make that so.) This way, people using the old method of running le-auto from a git checkout will not end up using a bleeding-edge version, letting us work on the tip-of-tree version more freely.
ConfigArgParse has a conditional dependency for Pythons < 2.7. On my local machine, I had a cached ConfigArgParse wheel built under 2.7, so it didn't carry those dependencies, and the `pip freeze` I used to determine the le-auto requirements thus missed it. From now on, we'll do those passes with `--no-cache-dir`.