D6319: automation: initial support for running Linux tests

indygreg (Gregory Szorc) phabricator at mercurial-scm.org
Sat Apr 27 14:52:03 EDT 2019


indygreg created this revision.
Herald added subscribers: mercurial-devel, mjpieters.
Herald added a reviewer: hg-reviewers.

REVISION SUMMARY
  Building on top of our Windows automation support, this commit
  implements support for performing automated tasks on remote Linux
  machines. Specifically, we implement support for running tests on
  ephemeral EC2 instances. This seems like a worthwhile place to
  start: building packages on Linux is more or less a solved problem,
  since we already have facilities for building in Docker containers,
  which provide "good enough" reproducibility guarantees.
  
  The new `run-tests-linux` command works similarly to
  `run-tests-windows`: it ensures an AMI with hg dependencies is
  available, provisions a temporary EC2 instance with this AMI, pushes
  local changes to that instance via SSH, then invokes `run-tests.py`.
  
  Using this new command, I am able to run the entire test harness
  substantially faster than I can on my local machine, courtesy of
  access to massive-core EC2 instances:
  
  wall: 16:20 ./run-tests.py -l (i7-6700K)
  wall: 14:00 automation.py run-tests-linux --ec2-instance c5.2xlarge
  wall:  8:30 automation.py run-tests-linux --ec2-instance m5.4xlarge
  wall:  8:04 automation.py run-tests-linux --ec2-instance c5.4xlarge
  wall:  4:30 automation.py run-tests-linux --ec2-instance c5.9xlarge
  wall:  3:57 automation.py run-tests-linux --ec2-instance m5.12xlarge
  wall:  3:05 automation.py run-tests-linux --ec2-instance m5.24xlarge
  wall:  3:02 automation.py run-tests-linux --ec2-instance c5.18xlarge
  
  ~3 minute wall time to run pretty much the entire test harness is
  not too bad!
  
  The AMIs install multiple versions of Python, and the
  `run-tests-linux` command lets you specify which one to use:
  
  automation.py run-tests-linux --python system3
  automation.py run-tests-linux --python 3.5
  automation.py run-tests-linux --python pypy2.7
  
  By default, the system Python 2.7 is used. Using this functionality,
  I was able to identify some unexpected test failures on PyPy!
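  The summary doesn't show how `--python` values are resolved (that
  lives in hgautomation/linux.py). As a rough illustration only, the
  lookup could be sketched like this; the paths and names below are
  hypothetical assumptions, not the actual AMI layout:

```python
# Hypothetical mapping of --python values to interpreter paths.
# The real table lives in contrib/automation/hgautomation/linux.py;
# these paths are illustrative assumptions, not the AMI's real layout.
INSTALLED_PYTHONS = {
    'system2': '/usr/bin/python2',
    'system3': '/usr/bin/python3',
    '3.5': '/hgdev/pyenv/shims/python3.5',
    'pypy2.7': '/hgdev/pyenv/shims/pypy2.7',
}


def python_binary(version='system2'):
    """Resolve a --python value to an interpreter path (default: system 2.7)."""
    try:
        return INSTALLED_PYTHONS[version]
    except KeyError:
        raise ValueError('unsupported --python value: %s' % version)
```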
  
  Also included is support for running with alternate filesystems.
  Simply pass --filesystem to the command to specify the type of
  filesystem to run tests on. When the ephemeral instance is started,
  a new filesystem will be created and tests will run from it:
  
  wall:  4:30 automation.py run-tests-linux --ec2-instance c5.9xlarge
  wall:  4:20 automation.py run-tests-linux --ec2-instance c5d.9xlarge --filesystem xfs
  wall:  4:24 automation.py run-tests-linux --ec2-instance c5d.9xlarge --filesystem tmpfs
  wall:  4:26 automation.py run-tests-linux --ec2-instance c5d.9xlarge --filesystem ext4
  
  We also support multiple Linux distributions:
  
  $ automation.py run-tests-linux --distro debian9
  total time: 298.1s; setup: 60.7s; tests: 237.5s; setup overhead: 20.4%
  
  $ automation.py run-tests-linux --distro ubuntu18.04
  total time: 286.1s; setup: 61.3s; tests: 224.7s; setup overhead: 21.4%
  
  $ automation.py run-tests-linux --distro ubuntu18.10
  total time: 278.5s; setup: 58.2s; tests: 220.3s; setup overhead: 20.9%
  
  $ automation.py run-tests-linux --distro ubuntu19.04
  total time: 265.8s; setup: 42.5s; tests: 223.3s; setup overhead: 16.0%
  
  Debian and Ubuntu are supported because those are what I use and am
  most familiar with. It should be easy enough to add support for other
  distros.
  
  Unlike the Windows EC2 instances, Linux EC2 instances bill per
  second, so the cost of instantiating an ephemeral instance isn't as
  severe. That being said, there is some overhead: it takes several
  dozen seconds for the instance to boot, push local changes, and
  build Mercurial. During this time, the instance is largely CPU idle
  and wasting money. Even with this inefficiency, running tests is
  relatively cheap: $0.15-$0.25 per full test run. A machine that runs
  tests as efficiently as these EC2 instances would cost, say, $6,000,
  so you can run the test harness more than 20,000 times for the cost
  of an equivalent machine. Running tests in EC2 is almost certainly
  cheaper than buying a beefy machine for developers to use :)
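  The per-run cost figure is easy to sanity-check from per-second
  billing. The rates below are approximate 2019 us-east-1 on-demand
  prices (an assumption; consult current EC2 pricing), but the
  arithmetic is the point:

```python
# Approximate 2019 us-east-1 on-demand rates in USD/hour (assumed
# values; check current EC2 pricing). Linux instances bill per second.
HOURLY_RATE = {
    'c5.18xlarge': 3.06,
    'm5.24xlarge': 4.608,
}


def run_cost(instance_type, wall_seconds):
    """Cost of one test run under per-second billing."""
    return HOURLY_RATE[instance_type] * wall_seconds / 3600.0

# A 3:02 (182 s) run on c5.18xlarge comes out to roughly $0.15.
```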
  
  1. no-check-commit because foo_bar function names

REPOSITORY
  rHG Mercurial

REVISION DETAIL
  https://phab.mercurial-scm.org/D6319

AFFECTED FILES
  contrib/automation/README.rst
  contrib/automation/hgautomation/aws.py
  contrib/automation/hgautomation/cli.py
  contrib/automation/hgautomation/linux.py
  contrib/automation/hgautomation/ssh.py
  contrib/automation/linux-requirements-py2.txt
  contrib/automation/linux-requirements-py3.txt
  contrib/automation/linux-requirements.txt.in
  contrib/automation/requirements.txt
  contrib/automation/requirements.txt.in
  tests/test-check-code.t

CHANGE DETAILS

diff --git a/tests/test-check-code.t b/tests/test-check-code.t
--- a/tests/test-check-code.t
+++ b/tests/test-check-code.t
@@ -15,6 +15,8 @@
   Skipping contrib/automation/hgautomation/__init__.py it has no-che?k-code (glob)
   Skipping contrib/automation/hgautomation/aws.py it has no-che?k-code (glob)
   Skipping contrib/automation/hgautomation/cli.py it has no-che?k-code (glob)
+  Skipping contrib/automation/hgautomation/linux.py it has no-che?k-code (glob)
+  Skipping contrib/automation/hgautomation/ssh.py it has no-che?k-code (glob)
   Skipping contrib/automation/hgautomation/windows.py it has no-che?k-code (glob)
   Skipping contrib/automation/hgautomation/winrm.py it has no-che?k-code (glob)
   Skipping contrib/packaging/hgpackaging/downloads.py it has no-che?k-code (glob)
diff --git a/contrib/automation/requirements.txt.in b/contrib/automation/requirements.txt.in
--- a/contrib/automation/requirements.txt.in
+++ b/contrib/automation/requirements.txt.in
@@ -1,2 +1,3 @@
 boto3
+paramiko
 pypsrp
diff --git a/contrib/automation/requirements.txt b/contrib/automation/requirements.txt
--- a/contrib/automation/requirements.txt
+++ b/contrib/automation/requirements.txt
@@ -8,6 +8,27 @@
     --hash=sha256:2f1adbb7546ed199e3c90ef23ec95c5cf3585bac7d11fb7eb562a3fe89c64e87 \
     --hash=sha256:9d5c20441baf0cb60a4ac34cc447c6c189024b6b4c6cd7877034f4965c464e49 \
     # via cryptography
+bcrypt==3.1.6 \
+    --hash=sha256:0ba875eb67b011add6d8c5b76afbd92166e98b1f1efab9433d5dc0fafc76e203 \
+    --hash=sha256:21ed446054c93e209434148ef0b362432bb82bbdaf7beef70a32c221f3e33d1c \
+    --hash=sha256:28a0459381a8021f57230954b9e9a65bb5e3d569d2c253c5cac6cb181d71cf23 \
+    --hash=sha256:2aed3091eb6f51c26b7c2fad08d6620d1c35839e7a362f706015b41bd991125e \
+    --hash=sha256:2fa5d1e438958ea90eaedbf8082c2ceb1a684b4f6c75a3800c6ec1e18ebef96f \
+    --hash=sha256:3a73f45484e9874252002793518da060fb11eaa76c30713faa12115db17d1430 \
+    --hash=sha256:3e489787638a36bb466cd66780e15715494b6d6905ffdbaede94440d6d8e7dba \
+    --hash=sha256:44636759d222baa62806bbceb20e96f75a015a6381690d1bc2eda91c01ec02ea \
+    --hash=sha256:678c21b2fecaa72a1eded0cf12351b153615520637efcadc09ecf81b871f1596 \
+    --hash=sha256:75460c2c3786977ea9768d6c9d8957ba31b5fbeb0aae67a5c0e96aab4155f18c \
+    --hash=sha256:8ac06fb3e6aacb0a95b56eba735c0b64df49651c6ceb1ad1cf01ba75070d567f \
+    --hash=sha256:8fdced50a8b646fff8fa0e4b1c5fd940ecc844b43d1da5a980cb07f2d1b1132f \
+    --hash=sha256:9b2c5b640a2da533b0ab5f148d87fb9989bf9bcb2e61eea6a729102a6d36aef9 \
+    --hash=sha256:a9083e7fa9adb1a4de5ac15f9097eb15b04e2c8f97618f1b881af40abce382e1 \
+    --hash=sha256:b7e3948b8b1a81c5a99d41da5fb2dc03ddb93b5f96fcd3fd27e643f91efa33e1 \
+    --hash=sha256:b998b8ca979d906085f6a5d84f7b5459e5e94a13fc27c28a3514437013b6c2f6 \
+    --hash=sha256:dd08c50bc6f7be69cd7ba0769acca28c846ec46b7a8ddc2acf4b9ac6f8a7457e \
+    --hash=sha256:de5badee458544ab8125e63e39afeedfcf3aef6a6e2282ac159c95ae7472d773 \
+    --hash=sha256:ede2a87333d24f55a4a7338a6ccdccf3eaa9bed081d1737e0db4dbd1a4f7e6b6 \
+    # via paramiko
 boto3==1.9.137 \
     --hash=sha256:882cc4869b47b51dae4b4a900769e72171ff00e0b6bca644b2d7a7ad7378f324 \
     --hash=sha256:cd503a7e7a04f1c14d2801f9727159dfa88c393b4004e98940fa4aa205d920c8
@@ -48,7 +69,7 @@
     --hash=sha256:e070535507bd6aa07124258171be2ee8dfc19119c28ca94c9dfb7efd23564512 \
     --hash=sha256:e1ff2748c84d97b065cc95429814cdba39bcbd77c9c85c89344b317dc0d9cbff \
     --hash=sha256:ed851c75d1e0e043cbf5ca9a8e1b13c4c90f3fbd863dacb01c0808e2b5204201 \
-    # via cryptography
+    # via bcrypt, cryptography, pynacl
 chardet==3.0.4 \
     --hash=sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae \
     --hash=sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691 \
@@ -73,7 +94,7 @@
     --hash=sha256:d4afbb0840f489b60f5a580a41a1b9c3622e08ecb5eec8614d4fb4cd914c4460 \
     --hash=sha256:d9ed28030797c00f4bc43c86bf819266c76a5ea61d006cd4078a93ebf7da6bfd \
     --hash=sha256:e603aa7bb52e4e8ed4119a58a03b60323918467ef209e6ff9db3ac382e5cf2c6 \
-    # via pypsrp
+    # via paramiko, pypsrp
 docutils==0.14 \
     --hash=sha256:02aec4bd92ab067f6ff27a38a38a41173bf01bed8f89157768c1573f53e474a6 \
     --hash=sha256:51e64ef2ebfb29cae1faa133b3710143496eca21c530f3f71424d77687764274 \
@@ -91,9 +112,37 @@
     --hash=sha256:bb2fd03c665f0f62c5f65695b62dcdb07fb7a45df6ebc86c770be2054d6902dd \
     --hash=sha256:ce5b4483ed761f341a538a426a71a52e5a9cf5fd834ebef1d2090f9eef14b3f8 \
     # via pypsrp
+paramiko==2.4.2 \
+    --hash=sha256:3c16b2bfb4c0d810b24c40155dbfd113c0521e7e6ee593d704e84b4c658a1f3b \
+    --hash=sha256:a8975a7df3560c9f1e2b43dc54ebd40fd00a7017392ca5445ce7df409f900fcb
+pyasn1==0.4.5 \
+    --hash=sha256:da2420fe13a9452d8ae97a0e478adde1dee153b11ba832a95b223a2ba01c10f7 \
+    --hash=sha256:da6b43a8c9ae93bc80e2739efb38cc776ba74a886e3e9318d65fe81a8b8a2c6e \
+    # via paramiko
 pycparser==2.19 \
     --hash=sha256:a988718abfad80b6b157acce7bf130a30876d27603738ac39f140993246b25b3 \
     # via cffi
+pynacl==1.3.0 \
+    --hash=sha256:05c26f93964373fc0abe332676cb6735f0ecad27711035b9472751faa8521255 \
+    --hash=sha256:0c6100edd16fefd1557da078c7a31e7b7d7a52ce39fdca2bec29d4f7b6e7600c \
+    --hash=sha256:0d0a8171a68edf51add1e73d2159c4bc19fc0718e79dec51166e940856c2f28e \
+    --hash=sha256:1c780712b206317a746ace34c209b8c29dbfd841dfbc02aa27f2084dd3db77ae \
+    --hash=sha256:2424c8b9f41aa65bbdbd7a64e73a7450ebb4aa9ddedc6a081e7afcc4c97f7621 \
+    --hash=sha256:2d23c04e8d709444220557ae48ed01f3f1086439f12dbf11976e849a4926db56 \
+    --hash=sha256:30f36a9c70450c7878053fa1344aca0145fd47d845270b43a7ee9192a051bf39 \
+    --hash=sha256:37aa336a317209f1bb099ad177fef0da45be36a2aa664507c5d72015f956c310 \
+    --hash=sha256:4943decfc5b905748f0756fdd99d4f9498d7064815c4cf3643820c9028b711d1 \
+    --hash=sha256:57ef38a65056e7800859e5ba9e6091053cd06e1038983016effaffe0efcd594a \
+    --hash=sha256:5bd61e9b44c543016ce1f6aef48606280e45f892a928ca7068fba30021e9b786 \
+    --hash=sha256:6482d3017a0c0327a49dddc8bd1074cc730d45db2ccb09c3bac1f8f32d1eb61b \
+    --hash=sha256:7d3ce02c0784b7cbcc771a2da6ea51f87e8716004512493a2b69016326301c3b \
+    --hash=sha256:a14e499c0f5955dcc3991f785f3f8e2130ed504fa3a7f44009ff458ad6bdd17f \
+    --hash=sha256:a39f54ccbcd2757d1d63b0ec00a00980c0b382c62865b61a505163943624ab20 \
+    --hash=sha256:aabb0c5232910a20eec8563503c153a8e78bbf5459490c49ab31f6adf3f3a415 \
+    --hash=sha256:bd4ecb473a96ad0f90c20acba4f0bf0df91a4e03a1f4dd6a4bdc9ca75aa3a715 \
+    --hash=sha256:e2da3c13307eac601f3de04887624939aca8ee3c9488a0bb0eca4fb9401fc6b1 \
+    --hash=sha256:f67814c38162f4deb31f68d590771a29d5ae3b1bd64b75cf232308e5c74777e0 \
+    # via paramiko
 pypsrp==0.3.1 \
     --hash=sha256:309853380fe086090a03cc6662a778ee69b1cae355ae4a932859034fd76e9d0b \
     --hash=sha256:90f946254f547dc3493cea8493c819ab87e152a755797c93aa2668678ba8ae85
@@ -112,7 +161,7 @@
 six==1.12.0 \
     --hash=sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c \
     --hash=sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73 \
-    # via cryptography, pypsrp, python-dateutil
+    # via bcrypt, cryptography, pynacl, pypsrp, python-dateutil
 urllib3==1.24.2 \
     --hash=sha256:4c291ca23bbb55c76518905869ef34bdd5f0e46af7afe6861e8375643ffee1a0 \
     --hash=sha256:9a247273df709c4fedb38c711e44292304f73f39ab01beda9f6b9fc375669ac3 \
diff --git a/contrib/automation/linux-requirements.txt.in b/contrib/automation/linux-requirements.txt.in
new file mode 100644
--- /dev/null
+++ b/contrib/automation/linux-requirements.txt.in
@@ -0,0 +1,12 @@
+# Bazaar doesn't work with Python 3 nor PyPy.
+bzr ; python_version <= '2.7' and platform_python_implementation == 'CPython'
+docutils
+fuzzywuzzy
+pyflakes
+pygments
+pylint
+# Needed to avoid warnings from fuzzywuzzy.
+python-Levenshtein
+# typed-ast dependency doesn't install on PyPy.
+typed-ast ; python_version >= '3.0' and platform_python_implementation != 'PyPy'
+vcrpy
diff --git a/contrib/automation/linux-requirements-py3.txt b/contrib/automation/linux-requirements-py3.txt
new file mode 100644
--- /dev/null
+++ b/contrib/automation/linux-requirements-py3.txt
@@ -0,0 +1,159 @@
+#
+# This file is autogenerated by pip-compile
+# To update, run:
+#
+#    pip-compile -U --generate-hashes --output-file contrib/automation/linux-requirements-py3.txt contrib/automation/linux-requirements.txt.in
+#
+astroid==2.2.5 \
+    --hash=sha256:6560e1e1749f68c64a4b5dee4e091fce798d2f0d84ebe638cf0e0585a343acf4 \
+    --hash=sha256:b65db1bbaac9f9f4d190199bb8680af6f6f84fd3769a5ea883df8a91fe68b4c4 \
+    # via pylint
+docutils==0.14 \
+    --hash=sha256:02aec4bd92ab067f6ff27a38a38a41173bf01bed8f89157768c1573f53e474a6 \
+    --hash=sha256:51e64ef2ebfb29cae1faa133b3710143496eca21c530f3f71424d77687764274 \
+    --hash=sha256:7a4bd47eaf6596e1295ecb11361139febe29b084a87bf005bf899f9a42edc3c6
+fuzzywuzzy==0.17.0 \
+    --hash=sha256:5ac7c0b3f4658d2743aa17da53a55598144edbc5bee3c6863840636e6926f254 \
+    --hash=sha256:6f49de47db00e1c71d40ad16da42284ac357936fa9b66bea1df63fed07122d62
+idna==2.8 \
+    --hash=sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407 \
+    --hash=sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c \
+    # via yarl
+isort==4.3.17 \
+    --hash=sha256:01cb7e1ca5e6c5b3f235f0385057f70558b70d2f00320208825fa62887292f43 \
+    --hash=sha256:268067462aed7eb2a1e237fcb287852f22077de3fb07964e87e00f829eea2d1a \
+    # via pylint
+lazy-object-proxy==1.3.1 \
+    --hash=sha256:0ce34342b419bd8f018e6666bfef729aec3edf62345a53b537a4dcc115746a33 \
+    --hash=sha256:1b668120716eb7ee21d8a38815e5eb3bb8211117d9a90b0f8e21722c0758cc39 \
+    --hash=sha256:209615b0fe4624d79e50220ce3310ca1a9445fd8e6d3572a896e7f9146bbf019 \
+    --hash=sha256:27bf62cb2b1a2068d443ff7097ee33393f8483b570b475db8ebf7e1cba64f088 \
+    --hash=sha256:27ea6fd1c02dcc78172a82fc37fcc0992a94e4cecf53cb6d73f11749825bd98b \
+    --hash=sha256:2c1b21b44ac9beb0fc848d3993924147ba45c4ebc24be19825e57aabbe74a99e \
+    --hash=sha256:2df72ab12046a3496a92476020a1a0abf78b2a7db9ff4dc2036b8dd980203ae6 \
+    --hash=sha256:320ffd3de9699d3892048baee45ebfbbf9388a7d65d832d7e580243ade426d2b \
+    --hash=sha256:50e3b9a464d5d08cc5227413db0d1c4707b6172e4d4d915c1c70e4de0bbff1f5 \
+    --hash=sha256:5276db7ff62bb7b52f77f1f51ed58850e315154249aceb42e7f4c611f0f847ff \
+    --hash=sha256:61a6cf00dcb1a7f0c773ed4acc509cb636af2d6337a08f362413c76b2b47a8dd \
+    --hash=sha256:6ae6c4cb59f199d8827c5a07546b2ab7e85d262acaccaacd49b62f53f7c456f7 \
+    --hash=sha256:7661d401d60d8bf15bb5da39e4dd72f5d764c5aff5a86ef52a042506e3e970ff \
+    --hash=sha256:7bd527f36a605c914efca5d3d014170b2cb184723e423d26b1fb2fd9108e264d \
+    --hash=sha256:7cb54db3535c8686ea12e9535eb087d32421184eacc6939ef15ef50f83a5e7e2 \
+    --hash=sha256:7f3a2d740291f7f2c111d86a1c4851b70fb000a6c8883a59660d95ad57b9df35 \
+    --hash=sha256:81304b7d8e9c824d058087dcb89144842c8e0dea6d281c031f59f0acf66963d4 \
+    --hash=sha256:933947e8b4fbe617a51528b09851685138b49d511af0b6c0da2539115d6d4514 \
+    --hash=sha256:94223d7f060301b3a8c09c9b3bc3294b56b2188e7d8179c762a1cda72c979252 \
+    --hash=sha256:ab3ca49afcb47058393b0122428358d2fbe0408cf99f1b58b295cfeb4ed39109 \
+    --hash=sha256:bd6292f565ca46dee4e737ebcc20742e3b5be2b01556dafe169f6c65d088875f \
+    --hash=sha256:cb924aa3e4a3fb644d0c463cad5bc2572649a6a3f68a7f8e4fbe44aaa6d77e4c \
+    --hash=sha256:d0fc7a286feac9077ec52a927fc9fe8fe2fabab95426722be4c953c9a8bede92 \
+    --hash=sha256:ddc34786490a6e4ec0a855d401034cbd1242ef186c20d79d2166d6a4bd449577 \
+    --hash=sha256:e34b155e36fa9da7e1b7c738ed7767fc9491a62ec6af70fe9da4a057759edc2d \
+    --hash=sha256:e5b9e8f6bda48460b7b143c3821b21b452cb3a835e6bbd5dd33aa0c8d3f5137d \
+    --hash=sha256:e81ebf6c5ee9684be8f2c87563880f93eedd56dd2b6146d8a725b50b7e5adb0f \
+    --hash=sha256:eb91be369f945f10d3a49f5f9be8b3d0b93a4c2be8f8a5b83b0571b8123e0a7a \
+    --hash=sha256:f460d1ceb0e4a5dcb2a652db0904224f367c9b3c1470d5a7683c0480e582468b \
+    # via astroid
+mccabe==0.6.1 \
+    --hash=sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42 \
+    --hash=sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f \
+    # via pylint
+multidict==4.5.2 \
+    --hash=sha256:024b8129695a952ebd93373e45b5d341dbb87c17ce49637b34000093f243dd4f \
+    --hash=sha256:041e9442b11409be5e4fc8b6a97e4bcead758ab1e11768d1e69160bdde18acc3 \
+    --hash=sha256:045b4dd0e5f6121e6f314d81759abd2c257db4634260abcfe0d3f7083c4908ef \
+    --hash=sha256:047c0a04e382ef8bd74b0de01407e8d8632d7d1b4db6f2561106af812a68741b \
+    --hash=sha256:068167c2d7bbeebd359665ac4fff756be5ffac9cda02375b5c5a7c4777038e73 \
+    --hash=sha256:148ff60e0fffa2f5fad2eb25aae7bef23d8f3b8bdaf947a65cdbe84a978092bc \
+    --hash=sha256:1d1c77013a259971a72ddaa83b9f42c80a93ff12df6a4723be99d858fa30bee3 \
+    --hash=sha256:1d48bc124a6b7a55006d97917f695effa9725d05abe8ee78fd60d6588b8344cd \
+    --hash=sha256:31dfa2fc323097f8ad7acd41aa38d7c614dd1960ac6681745b6da124093dc351 \
+    --hash=sha256:34f82db7f80c49f38b032c5abb605c458bac997a6c3142e0d6c130be6fb2b941 \
+    --hash=sha256:3d5dd8e5998fb4ace04789d1d008e2bb532de501218519d70bb672c4c5a2fc5d \
+    --hash=sha256:4a6ae52bd3ee41ee0f3acf4c60ceb3f44e0e3bc52ab7da1c2b2aa6703363a3d1 \
+    --hash=sha256:4b02a3b2a2f01d0490dd39321c74273fed0568568ea0e7ea23e02bd1fb10a10b \
+    --hash=sha256:4b843f8e1dd6a3195679d9838eb4670222e8b8d01bc36c9894d6c3538316fa0a \
+    --hash=sha256:5de53a28f40ef3c4fd57aeab6b590c2c663de87a5af76136ced519923d3efbb3 \
+    --hash=sha256:61b2b33ede821b94fa99ce0b09c9ece049c7067a33b279f343adfe35108a4ea7 \
+    --hash=sha256:6a3a9b0f45fd75dc05d8e93dc21b18fc1670135ec9544d1ad4acbcf6b86781d0 \
+    --hash=sha256:76ad8e4c69dadbb31bad17c16baee61c0d1a4a73bed2590b741b2e1a46d3edd0 \
+    --hash=sha256:7ba19b777dc00194d1b473180d4ca89a054dd18de27d0ee2e42a103ec9b7d014 \
+    --hash=sha256:7c1b7eab7a49aa96f3db1f716f0113a8a2e93c7375dd3d5d21c4941f1405c9c5 \
+    --hash=sha256:7fc0eee3046041387cbace9314926aa48b681202f8897f8bff3809967a049036 \
+    --hash=sha256:8ccd1c5fff1aa1427100ce188557fc31f1e0a383ad8ec42c559aabd4ff08802d \
+    --hash=sha256:8e08dd76de80539d613654915a2f5196dbccc67448df291e69a88712ea21e24a \
+    --hash=sha256:c18498c50c59263841862ea0501da9f2b3659c00db54abfbf823a80787fde8ce \
+    --hash=sha256:c49db89d602c24928e68c0d510f4fcf8989d77defd01c973d6cbe27e684833b1 \
+    --hash=sha256:ce20044d0317649ddbb4e54dab3c1bcc7483c78c27d3f58ab3d0c7e6bc60d26a \
+    --hash=sha256:d1071414dd06ca2eafa90c85a079169bfeb0e5f57fd0b45d44c092546fcd6fd9 \
+    --hash=sha256:d3be11ac43ab1a3e979dac80843b42226d5d3cccd3986f2e03152720a4297cd7 \
+    --hash=sha256:db603a1c235d110c860d5f39988ebc8218ee028f07a7cbc056ba6424372ca31b \
+    # via yarl
+pyflakes==2.1.1 \
+    --hash=sha256:17dbeb2e3f4d772725c777fabc446d5634d1038f234e77343108ce445ea69ce0 \
+    --hash=sha256:d976835886f8c5b31d47970ed689944a0262b5f3afa00a5a7b4dc81e5449f8a2
+pygments==2.3.1 \
+    --hash=sha256:5ffada19f6203563680669ee7f53b64dabbeb100eb51b61996085e99c03b284a \
+    --hash=sha256:e8218dd399a61674745138520d0d4cf2621d7e032439341bc3f647bff125818d
+pylint==2.3.1 \
+    --hash=sha256:5d77031694a5fb97ea95e828c8d10fc770a1df6eb3906067aaed42201a8a6a09 \
+    --hash=sha256:723e3db49555abaf9bf79dc474c6b9e2935ad82230b10c1138a71ea41ac0fff1
+python-levenshtein==0.12.0 \
+    --hash=sha256:033a11de5e3d19ea25c9302d11224e1a1898fe5abd23c61c7c360c25195e3eb1
+pyyaml==5.1 \
+    --hash=sha256:1adecc22f88d38052fb787d959f003811ca858b799590a5eaa70e63dca50308c \
+    --hash=sha256:436bc774ecf7c103814098159fbb84c2715d25980175292c648f2da143909f95 \
+    --hash=sha256:460a5a4248763f6f37ea225d19d5c205677d8d525f6a83357ca622ed541830c2 \
+    --hash=sha256:5a22a9c84653debfbf198d02fe592c176ea548cccce47553f35f466e15cf2fd4 \
+    --hash=sha256:7a5d3f26b89d688db27822343dfa25c599627bc92093e788956372285c6298ad \
+    --hash=sha256:9372b04a02080752d9e6f990179a4ab840227c6e2ce15b95e1278456664cf2ba \
+    --hash=sha256:a5dcbebee834eaddf3fa7366316b880ff4062e4bcc9787b78c7fbb4a26ff2dd1 \
+    --hash=sha256:aee5bab92a176e7cd034e57f46e9df9a9862a71f8f37cad167c6fc74c65f5b4e \
+    --hash=sha256:c51f642898c0bacd335fc119da60baae0824f2cde95b0330b56c0553439f0673 \
+    --hash=sha256:c68ea4d3ba1705da1e0d85da6684ac657912679a649e8868bd850d2c299cce13 \
+    --hash=sha256:e23d0cc5299223dcc37885dae624f382297717e459ea24053709675a976a3e19 \
+    # via vcrpy
+six==1.12.0 \
+    --hash=sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c \
+    --hash=sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73 \
+    # via astroid, vcrpy
+typed-ast==1.3.4 ; python_version >= "3.0" and platform_python_implementation != "PyPy" \
+    --hash=sha256:04894d268ba6eab7e093d43107869ad49e7b5ef40d1a94243ea49b352061b200 \
+    --hash=sha256:16616ece19daddc586e499a3d2f560302c11f122b9c692bc216e821ae32aa0d0 \
+    --hash=sha256:252fdae740964b2d3cdfb3f84dcb4d6247a48a6abe2579e8029ab3be3cdc026c \
+    --hash=sha256:2af80a373af123d0b9f44941a46df67ef0ff7a60f95872412a145f4500a7fc99 \
+    --hash=sha256:2c88d0a913229a06282b285f42a31e063c3bf9071ff65c5ea4c12acb6977c6a7 \
+    --hash=sha256:2ea99c029ebd4b5a308d915cc7fb95b8e1201d60b065450d5d26deb65d3f2bc1 \
+    --hash=sha256:3d2e3ab175fc097d2a51c7a0d3fda442f35ebcc93bb1d7bd9b95ad893e44c04d \
+    --hash=sha256:4766dd695548a15ee766927bf883fb90c6ac8321be5a60c141f18628fb7f8da8 \
+    --hash=sha256:56b6978798502ef66625a2e0f80cf923da64e328da8bbe16c1ff928c70c873de \
+    --hash=sha256:5cddb6f8bce14325b2863f9d5ac5c51e07b71b462361fd815d1d7706d3a9d682 \
+    --hash=sha256:644ee788222d81555af543b70a1098f2025db38eaa99226f3a75a6854924d4db \
+    --hash=sha256:64cf762049fc4775efe6b27161467e76d0ba145862802a65eefc8879086fc6f8 \
+    --hash=sha256:68c362848d9fb71d3c3e5f43c09974a0ae319144634e7a47db62f0f2a54a7fa7 \
+    --hash=sha256:6c1f3c6f6635e611d58e467bf4371883568f0de9ccc4606f17048142dec14a1f \
+    --hash=sha256:b213d4a02eec4ddf622f4d2fbc539f062af3788d1f332f028a2e19c42da53f15 \
+    --hash=sha256:bb27d4e7805a7de0e35bd0cb1411bc85f807968b2b0539597a49a23b00a622ae \
+    --hash=sha256:c9d414512eaa417aadae7758bc118868cd2396b0e6138c1dd4fda96679c079d3 \
+    --hash=sha256:f0937165d1e25477b01081c4763d2d9cdc3b18af69cb259dd4f640c9b900fe5e \
+    --hash=sha256:fb96a6e2c11059ecf84e6741a319f93f683e440e341d4489c9b161eca251cf2a \
+    --hash=sha256:fc71d2d6ae56a091a8d94f33ec9d0f2001d1cb1db423d8b4355debfe9ce689b7
+vcrpy==2.0.1 \
+    --hash=sha256:127e79cf7b569d071d1bd761b83f7b62b2ce2a2eb63ceca7aa67cba8f2602ea3 \
+    --hash=sha256:57be64aa8e9883a4117d0b15de28af62275c001abcdb00b6dc2d4406073d9a4f
+wrapt==1.11.1 \
+    --hash=sha256:4aea003270831cceb8a90ff27c4031da6ead7ec1886023b80ce0dfe0adf61533 \
+    # via astroid, vcrpy
+yarl==1.3.0 \
+    --hash=sha256:024ecdc12bc02b321bc66b41327f930d1c2c543fa9a561b39861da9388ba7aa9 \
+    --hash=sha256:2f3010703295fbe1aec51023740871e64bb9664c789cba5a6bdf404e93f7568f \
+    --hash=sha256:3890ab952d508523ef4881457c4099056546593fa05e93da84c7250516e632eb \
+    --hash=sha256:3e2724eb9af5dc41648e5bb304fcf4891adc33258c6e14e2a7414ea32541e320 \
+    --hash=sha256:5badb97dd0abf26623a9982cd448ff12cb39b8e4c94032ccdedf22ce01a64842 \
+    --hash=sha256:73f447d11b530d860ca1e6b582f947688286ad16ca42256413083d13f260b7a0 \
+    --hash=sha256:7ab825726f2940c16d92aaec7d204cfc34ac26c0040da727cf8ba87255a33829 \
+    --hash=sha256:b25de84a8c20540531526dfbb0e2d2b648c13fd5dd126728c496d7c3fea33310 \
+    --hash=sha256:c6e341f5a6562af74ba55205dbd56d248daf1b5748ec48a0200ba227bb9e33f4 \
+    --hash=sha256:c9bb7c249c4432cd47e75af3864bc02d26c9594f49c82e2a28624417f0ae63b8 \
+    --hash=sha256:e060906c0c585565c718d1c3841747b61c5439af2211e185f6739a9412dfbde1 \
+    # via vcrpy
diff --git a/contrib/automation/linux-requirements-py2.txt b/contrib/automation/linux-requirements-py2.txt
new file mode 100644
--- /dev/null
+++ b/contrib/automation/linux-requirements-py2.txt
@@ -0,0 +1,130 @@
+#
+# This file is autogenerated by pip-compile
+# To update, run:
+#
+#    pip-compile -U --generate-hashes --output-file contrib/automation/linux-requirements-py2.txt contrib/automation/linux-requirements.txt.in
+#
+astroid==1.6.6 \
+    --hash=sha256:87de48a92e29cedf7210ffa853d11441e7ad94cb47bacd91b023499b51cbc756 \
+    --hash=sha256:d25869fc7f44f1d9fb7d24fd7ea0639656f5355fc3089cd1f3d18c6ec6b124c7 \
+    # via pylint
+backports.functools-lru-cache==1.5 \
+    --hash=sha256:9d98697f088eb1b0fa451391f91afb5e3ebde16bbdb272819fd091151fda4f1a \
+    --hash=sha256:f0b0e4eba956de51238e17573b7087e852dfe9854afd2e9c873f73fc0ca0a6dd \
+    # via astroid, isort, pylint
+bzr==2.7.0 ; python_version <= "2.7" and platform_python_implementation == "CPython" \
+    --hash=sha256:c9f6bbe0a50201dadc5fddadd94ba50174193c6cf6e39e16f6dd0ad98a1df338
+configparser==3.7.4 \
+    --hash=sha256:8be81d89d6e7b4c0d4e44bcc525845f6da25821de80cb5e06e7e0238a2899e32 \
+    --hash=sha256:da60d0014fd8c55eb48c1c5354352e363e2d30bbf7057e5e171a468390184c75 \
+    # via pylint
+contextlib2==0.5.5 \
+    --hash=sha256:509f9419ee91cdd00ba34443217d5ca51f5a364a404e1dce9e8979cea969ca48 \
+    --hash=sha256:f5260a6e679d2ff42ec91ec5252f4eeffdcf21053db9113bd0a8e4d953769c00 \
+    # via vcrpy
+docutils==0.14 \
+    --hash=sha256:02aec4bd92ab067f6ff27a38a38a41173bf01bed8f89157768c1573f53e474a6 \
+    --hash=sha256:51e64ef2ebfb29cae1faa133b3710143496eca21c530f3f71424d77687764274 \
+    --hash=sha256:7a4bd47eaf6596e1295ecb11361139febe29b084a87bf005bf899f9a42edc3c6
+enum34==1.1.6 \
+    --hash=sha256:2d81cbbe0e73112bdfe6ef8576f2238f2ba27dd0d55752a776c41d38b7da2850 \
+    --hash=sha256:644837f692e5f550741432dd3f223bbb9852018674981b1664e5dc339387588a \
+    --hash=sha256:6bd0f6ad48ec2aa117d3d141940d484deccda84d4fcd884f5c3d93c23ecd8c79 \
+    --hash=sha256:8ad8c4783bf61ded74527bffb48ed9b54166685e4230386a9ed9b1279e2df5b1 \
+    # via astroid
+funcsigs==1.0.2 \
+    --hash=sha256:330cc27ccbf7f1e992e69fef78261dc7c6569012cf397db8d3de0234e6c937ca \
+    --hash=sha256:a7bb0f2cf3a3fd1ab2732cb49eba4252c2af4240442415b4abce3b87022a8f50 \
+    # via mock
+futures==3.2.0 \
+    --hash=sha256:9ec02aa7d674acb8618afb127e27fde7fc68994c0437ad759fa094a574adb265 \
+    --hash=sha256:ec0a6cb848cc212002b9828c3e34c675e0c9ff6741dc445cab6fdd4e1085d1f1 \
+    # via isort
+fuzzywuzzy==0.17.0 \
+    --hash=sha256:5ac7c0b3f4658d2743aa17da53a55598144edbc5bee3c6863840636e6926f254 \
+    --hash=sha256:6f49de47db00e1c71d40ad16da42284ac357936fa9b66bea1df63fed07122d62
+isort==4.3.17 \
+    --hash=sha256:01cb7e1ca5e6c5b3f235f0385057f70558b70d2f00320208825fa62887292f43 \
+    --hash=sha256:268067462aed7eb2a1e237fcb287852f22077de3fb07964e87e00f829eea2d1a \
+    # via pylint
+lazy-object-proxy==1.3.1 \
+    --hash=sha256:0ce34342b419bd8f018e6666bfef729aec3edf62345a53b537a4dcc115746a33 \
+    --hash=sha256:1b668120716eb7ee21d8a38815e5eb3bb8211117d9a90b0f8e21722c0758cc39 \
+    --hash=sha256:209615b0fe4624d79e50220ce3310ca1a9445fd8e6d3572a896e7f9146bbf019 \
+    --hash=sha256:27bf62cb2b1a2068d443ff7097ee33393f8483b570b475db8ebf7e1cba64f088 \
+    --hash=sha256:27ea6fd1c02dcc78172a82fc37fcc0992a94e4cecf53cb6d73f11749825bd98b \
+    --hash=sha256:2c1b21b44ac9beb0fc848d3993924147ba45c4ebc24be19825e57aabbe74a99e \
+    --hash=sha256:2df72ab12046a3496a92476020a1a0abf78b2a7db9ff4dc2036b8dd980203ae6 \
+    --hash=sha256:320ffd3de9699d3892048baee45ebfbbf9388a7d65d832d7e580243ade426d2b \
+    --hash=sha256:50e3b9a464d5d08cc5227413db0d1c4707b6172e4d4d915c1c70e4de0bbff1f5 \
+    --hash=sha256:5276db7ff62bb7b52f77f1f51ed58850e315154249aceb42e7f4c611f0f847ff \
+    --hash=sha256:61a6cf00dcb1a7f0c773ed4acc509cb636af2d6337a08f362413c76b2b47a8dd \
+    --hash=sha256:6ae6c4cb59f199d8827c5a07546b2ab7e85d262acaccaacd49b62f53f7c456f7 \
+    --hash=sha256:7661d401d60d8bf15bb5da39e4dd72f5d764c5aff5a86ef52a042506e3e970ff \
+    --hash=sha256:7bd527f36a605c914efca5d3d014170b2cb184723e423d26b1fb2fd9108e264d \
+    --hash=sha256:7cb54db3535c8686ea12e9535eb087d32421184eacc6939ef15ef50f83a5e7e2 \
+    --hash=sha256:7f3a2d740291f7f2c111d86a1c4851b70fb000a6c8883a59660d95ad57b9df35 \
+    --hash=sha256:81304b7d8e9c824d058087dcb89144842c8e0dea6d281c031f59f0acf66963d4 \
+    --hash=sha256:933947e8b4fbe617a51528b09851685138b49d511af0b6c0da2539115d6d4514 \
+    --hash=sha256:94223d7f060301b3a8c09c9b3bc3294b56b2188e7d8179c762a1cda72c979252 \
+    --hash=sha256:ab3ca49afcb47058393b0122428358d2fbe0408cf99f1b58b295cfeb4ed39109 \
+    --hash=sha256:bd6292f565ca46dee4e737ebcc20742e3b5be2b01556dafe169f6c65d088875f \
+    --hash=sha256:cb924aa3e4a3fb644d0c463cad5bc2572649a6a3f68a7f8e4fbe44aaa6d77e4c \
+    --hash=sha256:d0fc7a286feac9077ec52a927fc9fe8fe2fabab95426722be4c953c9a8bede92 \
+    --hash=sha256:ddc34786490a6e4ec0a855d401034cbd1242ef186c20d79d2166d6a4bd449577 \
+    --hash=sha256:e34b155e36fa9da7e1b7c738ed7767fc9491a62ec6af70fe9da4a057759edc2d \
+    --hash=sha256:e5b9e8f6bda48460b7b143c3821b21b452cb3a835e6bbd5dd33aa0c8d3f5137d \
+    --hash=sha256:e81ebf6c5ee9684be8f2c87563880f93eedd56dd2b6146d8a725b50b7e5adb0f \
+    --hash=sha256:eb91be369f945f10d3a49f5f9be8b3d0b93a4c2be8f8a5b83b0571b8123e0a7a \
+    --hash=sha256:f460d1ceb0e4a5dcb2a652db0904224f367c9b3c1470d5a7683c0480e582468b \
+    # via astroid
+mccabe==0.6.1 \
+    --hash=sha256:ab8a6258860da4b6677da4bd2fe5dc2c659cff31b3ee4f7f5d64e79735b80d42 \
+    --hash=sha256:dd8d182285a0fe56bace7f45b5e7d1a6ebcbf524e8f3bd87eb0f125271b8831f \
+    # via pylint
+mock==2.0.0 \
+    --hash=sha256:5ce3c71c5545b472da17b72268978914d0252980348636840bd34a00b5cc96c1 \
+    --hash=sha256:b158b6df76edd239b8208d481dc46b6afd45a846b7812ff0ce58971cf5bc8bba \
+    # via vcrpy
+pbr==5.1.3 \
+    --hash=sha256:8257baf496c8522437e8a6cfe0f15e00aedc6c0e0e7c9d55eeeeab31e0853843 \
+    --hash=sha256:8c361cc353d988e4f5b998555c88098b9d5964c2e11acf7b0d21925a66bb5824 \
+    # via mock
+pyflakes==2.1.1 \
+    --hash=sha256:17dbeb2e3f4d772725c777fabc446d5634d1038f234e77343108ce445ea69ce0 \
+    --hash=sha256:d976835886f8c5b31d47970ed689944a0262b5f3afa00a5a7b4dc81e5449f8a2
+pygments==2.3.1 \
+    --hash=sha256:5ffada19f6203563680669ee7f53b64dabbeb100eb51b61996085e99c03b284a \
+    --hash=sha256:e8218dd399a61674745138520d0d4cf2621d7e032439341bc3f647bff125818d
+pylint==1.9.4 \
+    --hash=sha256:02c2b6d268695a8b64ad61847f92e611e6afcff33fd26c3a2125370c4662905d \
+    --hash=sha256:ee1e85575587c5b58ddafa25e1c1b01691ef172e139fc25585e5d3f02451da93
+python-levenshtein==0.12.0 \
+    --hash=sha256:033a11de5e3d19ea25c9302d11224e1a1898fe5abd23c61c7c360c25195e3eb1
+pyyaml==5.1 \
+    --hash=sha256:1adecc22f88d38052fb787d959f003811ca858b799590a5eaa70e63dca50308c \
+    --hash=sha256:436bc774ecf7c103814098159fbb84c2715d25980175292c648f2da143909f95 \
+    --hash=sha256:460a5a4248763f6f37ea225d19d5c205677d8d525f6a83357ca622ed541830c2 \
+    --hash=sha256:5a22a9c84653debfbf198d02fe592c176ea548cccce47553f35f466e15cf2fd4 \
+    --hash=sha256:7a5d3f26b89d688db27822343dfa25c599627bc92093e788956372285c6298ad \
+    --hash=sha256:9372b04a02080752d9e6f990179a4ab840227c6e2ce15b95e1278456664cf2ba \
+    --hash=sha256:a5dcbebee834eaddf3fa7366316b880ff4062e4bcc9787b78c7fbb4a26ff2dd1 \
+    --hash=sha256:aee5bab92a176e7cd034e57f46e9df9a9862a71f8f37cad167c6fc74c65f5b4e \
+    --hash=sha256:c51f642898c0bacd335fc119da60baae0824f2cde95b0330b56c0553439f0673 \
+    --hash=sha256:c68ea4d3ba1705da1e0d85da6684ac657912679a649e8868bd850d2c299cce13 \
+    --hash=sha256:e23d0cc5299223dcc37885dae624f382297717e459ea24053709675a976a3e19 \
+    # via vcrpy
+singledispatch==3.4.0.3 \
+    --hash=sha256:5b06af87df13818d14f08a028e42f566640aef80805c3b50c5056b086e3c2b9c \
+    --hash=sha256:833b46966687b3de7f438c761ac475213e53b306740f1abfaa86e1d1aae56aa8 \
+    # via astroid, pylint
+six==1.12.0 \
+    --hash=sha256:3350809f0555b11f552448330d0b52d5f24c91a322ea4a15ef22629740f3761c \
+    --hash=sha256:d16a0141ec1a18405cd4ce8b4613101da75da0e9a7aec5bdd4fa804d0e0eba73 \
+    # via astroid, mock, pylint, singledispatch, vcrpy
+vcrpy==2.0.1 \
+    --hash=sha256:127e79cf7b569d071d1bd761b83f7b62b2ce2a2eb63ceca7aa67cba8f2602ea3 \
+    --hash=sha256:57be64aa8e9883a4117d0b15de28af62275c001abcdb00b6dc2d4406073d9a4f
+wrapt==1.11.1 \
+    --hash=sha256:4aea003270831cceb8a90ff27c4031da6ead7ec1886023b80ce0dfe0adf61533 \
+    # via astroid, vcrpy
diff --git a/contrib/automation/hgautomation/ssh.py b/contrib/automation/hgautomation/ssh.py
new file mode 100644
--- /dev/null
+++ b/contrib/automation/hgautomation/ssh.py
@@ -0,0 +1,67 @@
+# ssh.py - Interact with remote SSH servers
+#
+# Copyright 2019 Gregory Szorc <gregory.szorc@gmail.com>
+#
+# This software may be used and distributed according to the terms of the
+# GNU General Public License version 2 or any later version.
+
+# no-check-code because Python 3 native.
+
+import socket
+import time
+import warnings
+
+from cryptography.utils import (
+    CryptographyDeprecationWarning,
+)
+import paramiko
+
+
+def wait_for_ssh(hostname, port, timeout=60, username=None, key_filename=None):
+    """Wait for an SSH server to start on the specified host and port."""
+    class IgnoreHostKeyPolicy(paramiko.MissingHostKeyPolicy):
+        def missing_host_key(self, client, hostname, key):
+            return
+
+    end_time = time.time() + timeout
+
+    # paramiko triggers a CryptographyDeprecationWarning in the cryptography
+    # package. Suppress it so it doesn't pollute output.
+    with warnings.catch_warnings():
+        warnings.filterwarnings('ignore',
+                                category=CryptographyDeprecationWarning)
+
+        while True:
+            client = paramiko.SSHClient()
+            client.set_missing_host_key_policy(IgnoreHostKeyPolicy())
+            try:
+                client.connect(hostname, port=port, username=username,
+                               key_filename=key_filename,
+                               timeout=5.0, allow_agent=False,
+                               look_for_keys=False)
+
+                return client
+            except socket.error:
+                pass
+            except paramiko.AuthenticationException:
+                raise
+            except paramiko.SSHException:
+                pass
+
+            if time.time() >= end_time:
+                raise Exception('Timeout reached waiting for SSH')
+
+            time.sleep(1.0)
+
+
+def exec_command(client, command):
+    """exec_command wrapper that combines stderr/stdout and returns channel"""
+    chan = client.get_transport().open_session()
+
+    # Combine stderr into stdout before the command starts so early
+    # stderr output isn't lost.
+    chan.set_combine_stderr(True)
+    chan.exec_command(command)
+
+    stdin = chan.makefile('wb', -1)
+    stdout = chan.makefile('r', -1)
+
+    return chan, stdin, stdout
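The retry loop in `wait_for_ssh` is an instance of a generic poll-until-deadline pattern. A stdlib-only sketch of that pattern (the `connect` callable stands in for `SSHClient.connect`; the names here are illustrative, not part of the patch):

```python
import time

def wait_for(connect, timeout=60, interval=1.0):
    # Poll `connect` until it succeeds or the deadline passes.
    # Transient errors are swallowed; anything else propagates.
    end_time = time.time() + timeout
    while True:
        try:
            return connect()
        except ConnectionError:
            pass
        if time.time() >= end_time:
            raise Exception('Timeout reached waiting for SSH')
        time.sleep(interval)
```

The real function adds paramiko specifics (host key policy, auth errors re-raised immediately) on top of this skeleton.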
diff --git a/contrib/automation/hgautomation/linux.py b/contrib/automation/hgautomation/linux.py
new file mode 100644
--- /dev/null
+++ b/contrib/automation/hgautomation/linux.py
@@ -0,0 +1,545 @@
+# linux.py - Linux specific automation functionality
+#
+# Copyright 2019 Gregory Szorc <gregory.szorc@gmail.com>
+#
+# This software may be used and distributed according to the terms of the
+# GNU General Public License version 2 or any later version.
+
+# no-check-code because Python 3 native.
+
+import os
+import pathlib
+import shlex
+import subprocess
+import tempfile
+
+from .ssh import (
+    exec_command,
+)
+
+
+# Linux distributions that are supported.
+DISTROS = {
+    'debian9',
+    'ubuntu18.04',
+    'ubuntu18.10',
+    'ubuntu19.04',
+}
+
+INSTALL_PYTHONS = r'''
+PYENV2_VERSIONS="2.7.16 pypy2.7-7.1.1"
+PYENV3_VERSIONS="3.5.7 3.6.8 3.7.3 3.8-dev pypy3.5-7.0.0 pypy3.6-7.1.1"
+
+git clone https://github.com/pyenv/pyenv.git /hgdev/pyenv
+pushd /hgdev/pyenv
+git checkout 3faeda67bb33e07750d1a104271369a7384ca45c
+popd
+
+export PYENV_ROOT="/hgdev/pyenv"
+export PATH="$PYENV_ROOT/bin:$PATH"
+
+# pip 19.0.3.
+PIP_SHA256=efe99298f3fbb1f56201ce6b81d2658067d2f7d7dfc2d412e0d3cacc9a397c61
+wget -O get-pip.py --progress dot:mega https://github.com/pypa/get-pip/raw/fee32c376da1ff6496a798986d7939cd51e1644f/get-pip.py
+echo "${PIP_SHA256} get-pip.py" | sha256sum --check -
+
+VIRTUALENV_SHA256=984d7e607b0a5d1329425dd8845bd971b957424b5ba664729fab51ab8c11bc39
+VIRTUALENV_TARBALL=virtualenv-16.4.3.tar.gz
+wget -O ${VIRTUALENV_TARBALL} --progress dot:mega https://files.pythonhosted.org/packages/37/db/89d6b043b22052109da35416abc3c397655e4bd3cff031446ba02b9654fa/${VIRTUALENV_TARBALL}
+echo "${VIRTUALENV_SHA256} ${VIRTUALENV_TARBALL}" | sha256sum --check -
+
+for v in ${PYENV2_VERSIONS}; do
+    pyenv install -v ${v}
+    ${PYENV_ROOT}/versions/${v}/bin/python get-pip.py
+    ${PYENV_ROOT}/versions/${v}/bin/pip install ${VIRTUALENV_TARBALL}
+    ${PYENV_ROOT}/versions/${v}/bin/pip install -r /hgdev/requirements-py2.txt
+done
+
+for v in ${PYENV3_VERSIONS}; do
+    pyenv install -v ${v}
+    ${PYENV_ROOT}/versions/${v}/bin/python get-pip.py
+    ${PYENV_ROOT}/versions/${v}/bin/pip install -r /hgdev/requirements-py3.txt
+done
+
+pyenv global ${PYENV2_VERSIONS} ${PYENV3_VERSIONS} system
+'''.lstrip().replace('\r\n', '\n')
+
+
+BOOTSTRAP_VIRTUALENV = r'''
+/usr/bin/virtualenv /hgdev/venv-bootstrap
+
+HG_SHA256=1bdd21bb87d1e05fb5cd395d488d0e0cc2f2f90ce0fd248e31a03595da5ccb47
+HG_TARBALL=mercurial-4.9.1.tar.gz
+
+wget -O ${HG_TARBALL} --progress dot:mega https://www.mercurial-scm.org/release/${HG_TARBALL}
+echo "${HG_SHA256} ${HG_TARBALL}" | sha256sum --check -
+
+/hgdev/venv-bootstrap/bin/pip install ${HG_TARBALL}
+'''.lstrip().replace('\r\n', '\n')
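Both provisioning scripts pin downloads by SHA-256 and verify them with `sha256sum --check`. The equivalent check in Python, as a hypothetical helper (not part of the patch):

```python
import hashlib

def verify_sha256(path, expected_hex):
    # Stream the file in chunks so large tarballs don't need to fit
    # in memory, then compare against the pinned digest.
    h = hashlib.sha256()
    with open(path, 'rb') as fh:
        for chunk in iter(lambda: fh.read(65536), b''):
            h.update(chunk)
    return h.hexdigest() == expected_hex
```

Pinning digests this way means a compromised or corrupted mirror fails the bootstrap loudly instead of installing tampered artifacts.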
+
+
+BOOTSTRAP_DEBIAN = r'''
+#!/bin/bash
+
+set -ex
+
+DISTRO=`grep DISTRIB_ID /etc/lsb-release  | awk -F= '{{print $2}}'`
+DEBIAN_VERSION=`cat /etc/debian_version`
+LSB_RELEASE=`lsb_release -cs`
+
+sudo /usr/sbin/groupadd hg
+sudo /usr/sbin/groupadd docker
+sudo /usr/sbin/useradd -g hg -G sudo,docker -d /home/hg -m -s /bin/bash hg
+sudo mkdir /home/hg/.ssh
+sudo cp ~/.ssh/authorized_keys /home/hg/.ssh/authorized_keys
+sudo chown -R hg:hg /home/hg/.ssh
+sudo chmod 700 /home/hg/.ssh
+sudo chmod 600 /home/hg/.ssh/authorized_keys
+
+cat << EOF | sudo tee /etc/sudoers.d/90-hg
+hg ALL=(ALL) NOPASSWD:ALL
+EOF
+
+sudo apt-get update
+sudo DEBIAN_FRONTEND=noninteractive apt-get -yq dist-upgrade
+
+# Install packages necessary to set up Docker Apt repo.
+sudo DEBIAN_FRONTEND=noninteractive apt-get -yq install --no-install-recommends \
+    apt-transport-https \
+    gnupg
+
+cat > docker-apt-key << EOF
+-----BEGIN PGP PUBLIC KEY BLOCK-----
+
+mQINBFit2ioBEADhWpZ8/wvZ6hUTiXOwQHXMAlaFHcPH9hAtr4F1y2+OYdbtMuth
+lqqwp028AqyY+PRfVMtSYMbjuQuu5byyKR01BbqYhuS3jtqQmljZ/bJvXqnmiVXh
+38UuLa+z077PxyxQhu5BbqntTPQMfiyqEiU+BKbq2WmANUKQf+1AmZY/IruOXbnq
+L4C1+gJ8vfmXQt99npCaxEjaNRVYfOS8QcixNzHUYnb6emjlANyEVlZzeqo7XKl7
+UrwV5inawTSzWNvtjEjj4nJL8NsLwscpLPQUhTQ+7BbQXAwAmeHCUTQIvvWXqw0N
+cmhh4HgeQscQHYgOJjjDVfoY5MucvglbIgCqfzAHW9jxmRL4qbMZj+b1XoePEtht
+ku4bIQN1X5P07fNWzlgaRL5Z4POXDDZTlIQ/El58j9kp4bnWRCJW0lya+f8ocodo
+vZZ+Doi+fy4D5ZGrL4XEcIQP/Lv5uFyf+kQtl/94VFYVJOleAv8W92KdgDkhTcTD
+G7c0tIkVEKNUq48b3aQ64NOZQW7fVjfoKwEZdOqPE72Pa45jrZzvUFxSpdiNk2tZ
+XYukHjlxxEgBdC/J3cMMNRE1F4NCA3ApfV1Y7/hTeOnmDuDYwr9/obA8t016Yljj
+q5rdkywPf4JF8mXUW5eCN1vAFHxeg9ZWemhBtQmGxXnw9M+z6hWwc6ahmwARAQAB
+tCtEb2NrZXIgUmVsZWFzZSAoQ0UgZGViKSA8ZG9ja2VyQGRvY2tlci5jb20+iQI3
+BBMBCgAhBQJYrefAAhsvBQsJCAcDBRUKCQgLBRYCAwEAAh4BAheAAAoJEI2BgDwO
+v82IsskP/iQZo68flDQmNvn8X5XTd6RRaUH33kXYXquT6NkHJciS7E2gTJmqvMqd
+tI4mNYHCSEYxI5qrcYV5YqX9P6+Ko+vozo4nseUQLPH/ATQ4qL0Zok+1jkag3Lgk
+jonyUf9bwtWxFp05HC3GMHPhhcUSexCxQLQvnFWXD2sWLKivHp2fT8QbRGeZ+d3m
+6fqcd5Fu7pxsqm0EUDK5NL+nPIgYhN+auTrhgzhK1CShfGccM/wfRlei9Utz6p9P
+XRKIlWnXtT4qNGZNTN0tR+NLG/6Bqd8OYBaFAUcue/w1VW6JQ2VGYZHnZu9S8LMc
+FYBa5Ig9PxwGQOgq6RDKDbV+PqTQT5EFMeR1mrjckk4DQJjbxeMZbiNMG5kGECA8
+g383P3elhn03WGbEEa4MNc3Z4+7c236QI3xWJfNPdUbXRaAwhy/6rTSFbzwKB0Jm
+ebwzQfwjQY6f55MiI/RqDCyuPj3r3jyVRkK86pQKBAJwFHyqj9KaKXMZjfVnowLh
+9svIGfNbGHpucATqREvUHuQbNnqkCx8VVhtYkhDb9fEP2xBu5VvHbR+3nfVhMut5
+G34Ct5RS7Jt6LIfFdtcn8CaSas/l1HbiGeRgc70X/9aYx/V/CEJv0lIe8gP6uDoW
+FPIZ7d6vH+Vro6xuWEGiuMaiznap2KhZmpkgfupyFmplh0s6knymuQINBFit2ioB
+EADneL9S9m4vhU3blaRjVUUyJ7b/qTjcSylvCH5XUE6R2k+ckEZjfAMZPLpO+/tF
+M2JIJMD4SifKuS3xck9KtZGCufGmcwiLQRzeHF7vJUKrLD5RTkNi23ydvWZgPjtx
+Q+DTT1Zcn7BrQFY6FgnRoUVIxwtdw1bMY/89rsFgS5wwuMESd3Q2RYgb7EOFOpnu
+w6da7WakWf4IhnF5nsNYGDVaIHzpiqCl+uTbf1epCjrOlIzkZ3Z3Yk5CM/TiFzPk
+z2lLz89cpD8U+NtCsfagWWfjd2U3jDapgH+7nQnCEWpROtzaKHG6lA3pXdix5zG8
+eRc6/0IbUSWvfjKxLLPfNeCS2pCL3IeEI5nothEEYdQH6szpLog79xB9dVnJyKJb
+VfxXnseoYqVrRz2VVbUI5Blwm6B40E3eGVfUQWiux54DspyVMMk41Mx7QJ3iynIa
+1N4ZAqVMAEruyXTRTxc9XW0tYhDMA/1GYvz0EmFpm8LzTHA6sFVtPm/ZlNCX6P1X
+zJwrv7DSQKD6GGlBQUX+OeEJ8tTkkf8QTJSPUdh8P8YxDFS5EOGAvhhpMBYD42kQ
+pqXjEC+XcycTvGI7impgv9PDY1RCC1zkBjKPa120rNhv/hkVk/YhuGoajoHyy4h7
+ZQopdcMtpN2dgmhEegny9JCSwxfQmQ0zK0g7m6SHiKMwjwARAQABiQQ+BBgBCAAJ
+BQJYrdoqAhsCAikJEI2BgDwOv82IwV0gBBkBCAAGBQJYrdoqAAoJEH6gqcPyc/zY
+1WAP/2wJ+R0gE6qsce3rjaIz58PJmc8goKrir5hnElWhPgbq7cYIsW5qiFyLhkdp
+YcMmhD9mRiPpQn6Ya2w3e3B8zfIVKipbMBnke/ytZ9M7qHmDCcjoiSmwEXN3wKYI
+mD9VHONsl/CG1rU9Isw1jtB5g1YxuBA7M/m36XN6x2u+NtNMDB9P56yc4gfsZVES
+KA9v+yY2/l45L8d/WUkUi0YXomn6hyBGI7JrBLq0CX37GEYP6O9rrKipfz73XfO7
+JIGzOKZlljb/D9RX/g7nRbCn+3EtH7xnk+TK/50euEKw8SMUg147sJTcpQmv6UzZ
+cM4JgL0HbHVCojV4C/plELwMddALOFeYQzTif6sMRPf+3DSj8frbInjChC3yOLy0
+6br92KFom17EIj2CAcoeq7UPhi2oouYBwPxh5ytdehJkoo+sN7RIWua6P2WSmon5
+U888cSylXC0+ADFdgLX9K2zrDVYUG1vo8CX0vzxFBaHwN6Px26fhIT1/hYUHQR1z
+VfNDcyQmXqkOnZvvoMfz/Q0s9BhFJ/zU6AgQbIZE/hm1spsfgvtsD1frZfygXJ9f
+irP+MSAI80xHSf91qSRZOj4Pl3ZJNbq4yYxv0b1pkMqeGdjdCYhLU+LZ4wbQmpCk
+SVe2prlLureigXtmZfkqevRz7FrIZiu9ky8wnCAPwC7/zmS18rgP/17bOtL4/iIz
+QhxAAoAMWVrGyJivSkjhSGx1uCojsWfsTAm11P7jsruIL61ZzMUVE2aM3Pmj5G+W
+9AcZ58Em+1WsVnAXdUR//bMmhyr8wL/G1YO1V3JEJTRdxsSxdYa4deGBBY/Adpsw
+24jxhOJR+lsJpqIUeb999+R8euDhRHG9eFO7DRu6weatUJ6suupoDTRWtr/4yGqe
+dKxV3qQhNLSnaAzqW/1nA3iUB4k7kCaKZxhdhDbClf9P37qaRW467BLCVO/coL3y
+Vm50dwdrNtKpMBh3ZpbB1uJvgi9mXtyBOMJ3v8RZeDzFiG8HdCtg9RvIt/AIFoHR
+H3S+U79NT6i0KPzLImDfs8T7RlpyuMc4Ufs8ggyg9v3Ae6cN3eQyxcK3w0cbBwsh
+/nQNfsA6uu+9H7NhbehBMhYnpNZyrHzCmzyXkauwRAqoCbGCNykTRwsur9gS41TQ
+M8ssD1jFheOJf3hODnkKU+HKjvMROl1DK7zdmLdNzA1cvtZH/nCC9KPj1z8QC47S
+xx+dTZSx4ONAhwbS/LN3PoKtn8LPjY9NP9uDWI+TWYquS2U+KHDrBDlsgozDbs/O
+jCxcpDzNmXpWQHEtHU7649OXHP7UeNST1mCUCH5qdank0V1iejF6/CfTFU4MfcrG
+YT90qFF93M3v01BbxP+EIY2/9tiIPbrd
+=0YYh
+-----END PGP PUBLIC KEY BLOCK-----
+EOF
+
+sudo apt-key add docker-apt-key
+
+if [ "$DEBIAN_VERSION" = "9.8" ]; then
+cat << EOF | sudo tee -a /etc/apt/sources.list
+# Need backports for clang-format-6.0
+deb http://deb.debian.org/debian stretch-backports main
+
+# Sources are useful if we want to compile things locally.
+deb-src http://deb.debian.org/debian stretch main
+deb-src http://security.debian.org/debian-security stretch/updates main
+deb-src http://deb.debian.org/debian stretch-updates main
+deb-src http://deb.debian.org/debian stretch-backports main
+
+deb [arch=amd64] https://download.docker.com/linux/debian stretch stable
+EOF
+
+elif [ "$DISTRO" = "Ubuntu" ]; then
+cat << EOF | sudo tee -a /etc/apt/sources.list
+deb [arch=amd64] https://download.docker.com/linux/ubuntu $LSB_RELEASE stable
+EOF
+
+fi
+
+sudo apt-get update
+
+PACKAGES="\
+    btrfs-progs \
+    build-essential \
+    bzr \
+    clang-format-6.0 \
+    cvs \
+    darcs \
+    debhelper \
+    devscripts \
+    dpkg-dev \
+    dstat \
+    emacs \
+    gettext \
+    git \
+    htop \
+    iotop \
+    jfsutils \
+    libbz2-dev \
+    libexpat1-dev \
+    libffi-dev \
+    libgdbm-dev \
+    liblzma-dev \
+    libncurses5-dev \
+    libnss3-dev \
+    libreadline-dev \
+    libsqlite3-dev \
+    libssl-dev \
+    netbase \
+    ntfs-3g \
+    nvme-cli \
+    pyflakes \
+    pyflakes3 \
+    pylint \
+    pylint3 \
+    python-all-dev \
+    python-dev \
+    python-docutils \
+    python-fuzzywuzzy \
+    python-pygments \
+    python-subversion \
+    python-vcr \
+    python3-dev \
+    python3-docutils \
+    python3-fuzzywuzzy \
+    python3-pygments \
+    python3-vcr \
+    rsync \
+    sqlite3 \
+    subversion \
+    tcl-dev \
+    tk-dev \
+    tla \
+    unzip \
+    uuid-dev \
+    vim \
+    virtualenv \
+    wget \
+    xfsprogs \
+    zip \
+    zlib1g-dev"
+
+if [ "$DEBIAN_VERSION" = "9.8" ]; then
+    PACKAGES="$PACKAGES linux-perf"
+elif [ "$DISTRO" = "Ubuntu" ]; then
+    PACKAGES="$PACKAGES linux-tools-common"
+fi
+
+# Ubuntu 19.04 removes monotone.
+if [ "$LSB_RELEASE" != "disco" ]; then
+    PACKAGES="$PACKAGES monotone"
+fi
+
+# As of April 27, 2019, Docker hasn't published packages for
+# Ubuntu 19.04 yet.
+if [ "$LSB_RELEASE" != "disco" ]; then
+    PACKAGES="$PACKAGES docker-ce"
+fi
+
+sudo DEBIAN_FRONTEND=noninteractive apt-get -yq install --no-install-recommends $PACKAGES
+
+# Create clang-format symlink so test harness finds it.
+sudo update-alternatives --install /usr/bin/clang-format clang-format \
+    /usr/bin/clang-format-6.0 1000
+
+sudo mkdir /hgdev
+# Will be normalized to hg:hg later.
+sudo chown `whoami` /hgdev
+
+cp requirements-py2.txt /hgdev/requirements-py2.txt
+cp requirements-py3.txt /hgdev/requirements-py3.txt
+
+# Disable the pip version check because it uses the network and can
+# be annoying.
+cat << EOF | sudo tee -a /etc/pip.conf
+[global]
+disable-pip-version-check = True
+EOF
+
+{install_pythons}
+{bootstrap_virtualenv}
+
+/hgdev/venv-bootstrap/bin/hg clone https://www.mercurial-scm.org/repo/hg /hgdev/src
+
+# Mark the repo as non-publishing.
+cat >> /hgdev/src/.hg/hgrc << EOF
+[phases]
+publish = false
+EOF
+
+sudo chown -R hg:hg /hgdev
+'''.lstrip().format(
+    install_pythons=INSTALL_PYTHONS,
+    bootstrap_virtualenv=BOOTSTRAP_VIRTUALENV
+).replace('\r\n', '\n')
+
+
+# Prepares /hgwork for operations.
+PREPARE_HGDEV = '''
+#!/bin/bash
+
+set -e
+
+FS=$1
+
+ensure_device() {
+    if [ -z "${DEVICE}" ]; then
+        echo "could not find block device to format"
+        exit 1
+    fi
+}
+
+# Determine device to partition for extra filesystem.
+# If only 1 volume is present, it is the root volume and
+# should be /dev/nvme0n1. If multiple volumes are present, the
+# root volume could be nvme0n1 or nvme1n1. Use whichever one doesn't
+# have a partition.
+if [ -e /dev/nvme1n1 ]; then
+    if [ -e /dev/nvme0n1p1 ]; then
+        DEVICE=/dev/nvme1n1
+    else
+        DEVICE=/dev/nvme0n1
+    fi
+else
+    DEVICE=
+fi
+
+sudo mkdir /hgwork
+
+if [ "${FS}" != "default" -a "${FS}" != "tmpfs" ]; then
+    ensure_device
+    echo "creating ${FS} filesystem on ${DEVICE}"
+fi
+
+if [ "${FS}" = "default" ]; then
+    :
+
+elif [ "${FS}" = "btrfs" ]; then
+    sudo mkfs.btrfs ${DEVICE}
+    sudo mount ${DEVICE} /hgwork
+
+elif [ "${FS}" = "ext3" ]; then
+    # lazy_journal_init speeds up filesystem creation at the expense of
+    # integrity if things crash. We are an ephemeral instance, so we don't
+    # care about integrity.
+    sudo mkfs.ext3 -E lazy_journal_init=1 ${DEVICE}
+    sudo mount ${DEVICE} /hgwork
+
+elif [ "${FS}" = "ext4" ]; then
+    sudo mkfs.ext4 -E lazy_journal_init=1 ${DEVICE}
+    sudo mount ${DEVICE} /hgwork
+
+elif [ "${FS}" = "jfs" ]; then
+    sudo mkfs.jfs ${DEVICE}
+    sudo mount ${DEVICE} /hgwork
+
+elif [ "${FS}" = "tmpfs" ]; then
+    echo "creating tmpfs volume in /hgwork"
+    sudo mount -t tmpfs -o size=1024M tmpfs /hgwork
+
+elif [ "${FS}" = "xfs" ]; then
+    sudo mkfs.xfs ${DEVICE}
+    sudo mount ${DEVICE} /hgwork
+
+else
+    echo "unsupported filesystem: ${FS}"
+    exit 1
+fi
+
+echo "/hgwork ready"
+
+sudo chown hg:hg /hgwork
+mkdir /hgwork/tmp
+chown hg:hg /hgwork/tmp
+
+rsync -a /hgdev/src /hgwork/
+'''.lstrip().replace('\r\n', '\n')
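The block-device detection at the top of the script can be restated as a pure function, which makes the three cases easier to see (a hypothetical Python rendering of the shell logic):

```python
def pick_extra_device(present):
    # `present` is the set of device paths that exist on the instance.
    # With two volumes attached, the root volume is whichever nvme
    # device already carries a partition; format the other one.
    if '/dev/nvme1n1' in present:
        if '/dev/nvme0n1p1' in present:
            return '/dev/nvme1n1'
        return '/dev/nvme0n1'
    # Only the root volume is attached; nothing to format.
    return None
```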
+
+
+HG_UPDATE_CLEAN = '''
+set -ex
+
+HG=/hgdev/venv-bootstrap/bin/hg
+
+cd /hgwork/src
+${HG} --config extensions.purge= purge --all
+${HG} update -C $1
+${HG} log -r .
+'''.lstrip().replace('\r\n', '\n')
+
+
+def prepare_exec_environment(ssh_client, filesystem='default'):
+    """Prepare an EC2 instance to execute things.
+
+    The AMI has ``/hgdev`` bootstrapped with various Python installs
+    and a clone of the Mercurial repo.
+
+    In EC2, EBS volumes created from snapshots have poor initial performance:
+    each block must be copied from the snapshot on first access, which makes
+    I/O on fresh volumes extremely slow.
+
+    Furthermore, we may want to run operations, tests, etc. on alternative
+    filesystems so we can examine behavior across filesystems.
+
+    This function is used to facilitate executing operations on alternate
+    volumes.
+    """
+    sftp = ssh_client.open_sftp()
+
+    with sftp.open('/hgdev/prepare-hgdev', 'wb') as fh:
+        fh.write(PREPARE_HGDEV)
+        fh.chmod(0o0777)
+
+    command = 'sudo /hgdev/prepare-hgdev %s' % filesystem
+    chan, stdin, stdout = exec_command(ssh_client, command)
+    stdin.close()
+
+    for line in stdout:
+        print(line, end='')
+
+    res = chan.recv_exit_status()
+
+    if res:
+        raise Exception('non-0 exit code preparing exec environment: %d'
+                        % res)
+
+
+def synchronize_hg(source_path: pathlib.Path, ec2_instance, revision: str = None):
+    """Synchronize a local Mercurial source path to remote EC2 instance."""
+
+    with tempfile.TemporaryDirectory() as temp_dir:
+        temp_dir = pathlib.Path(temp_dir)
+
+        ssh_dir = temp_dir / '.ssh'
+        ssh_dir.mkdir()
+        ssh_dir.chmod(0o0700)
+
+        public_ip = ec2_instance.public_ip_address
+
+        ssh_config = ssh_dir / 'config'
+
+        with ssh_config.open('w', encoding='utf-8') as fh:
+            fh.write('Host %s\n' % public_ip)
+            fh.write('  User hg\n')
+            fh.write('  StrictHostKeyChecking no\n')
+            fh.write('  UserKnownHostsFile %s\n' % (ssh_dir / 'known_hosts'))
+            fh.write('  IdentityFile %s\n' % ec2_instance.ssh_private_key_path)
+
+        if not (source_path / '.hg').is_dir():
+            raise Exception('%s is not a Mercurial repository; synchronization '
+                            'not yet supported' % source_path)
+
+        env = dict(os.environ)
+        env['HGPLAIN'] = '1'
+        env['HGENCODING'] = 'utf-8'
+
+        hg_bin = source_path / 'hg'
+
+        res = subprocess.run(
+            ['python2.7', str(hg_bin), 'log', '-r', revision, '-T', '{node}'],
+            cwd=str(source_path), env=env, check=True, capture_output=True)
+
+        full_revision = res.stdout.decode('ascii')
+
+        args = [
+            'python2.7', str(hg_bin),
+            '--config', 'ui.ssh=ssh -F %s' % ssh_config,
+            '--config', 'ui.remotecmd=/hgdev/venv-bootstrap/bin/hg',
+            'push', '-f', '-r', full_revision,
+            'ssh://%s//hgwork/src' % public_ip,
+        ]
+
+        subprocess.run(args, cwd=str(source_path), env=env, check=True)
+
+        # TODO support synchronizing dirty working directory.
+
+        sftp = ec2_instance.ssh_client.open_sftp()
+
+        with sftp.open('/hgdev/hgup', 'wb') as fh:
+            fh.write(HG_UPDATE_CLEAN)
+            fh.chmod(0o0700)
+
+        chan, stdin, stdout = exec_command(
+            ec2_instance.ssh_client, '/hgdev/hgup %s' % full_revision)
+        stdin.close()
+
+        for line in stdout:
+            print(line, end='')
+
+        res = chan.recv_exit_status()
+
+        if res:
+            raise Exception('non-0 exit code updating working directory; %d'
+                            % res)
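`prepare_exec_environment` and `synchronize_hg` both follow a run, stream output, check exit status pattern over SSH. A local analogue using `subprocess` shows the shape (stdlib sketch; the real code streams over the paramiko channel instead):

```python
import subprocess

def stream_and_check(argv):
    # Run a command, echo its combined output line by line, and raise
    # if it exits non-zero -- mirroring exec_command() followed by
    # recv_exit_status() in the remote case.
    proc = subprocess.Popen(argv, stdout=subprocess.PIPE,
                            stderr=subprocess.STDOUT,
                            universal_newlines=True)
    for line in proc.stdout:
        print(line, end='')
    res = proc.wait()
    if res:
        raise Exception('non-0 exit code: %d' % res)
```

Streaming line by line gives live progress for long operations (bootstrap, test runs) instead of buffering all output until the command finishes.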
+
+
+def run_tests(ssh_client, python_version, test_flags=None):
+    """Run tests on a remote Linux machine via an SSH client."""
+    test_flags = test_flags or []
+
+    print('running tests')
+
+    if python_version == 'system2':
+        python = '/usr/bin/python2'
+    elif python_version == 'system3':
+        python = '/usr/bin/python3'
+    elif python_version.startswith('pypy'):
+        python = '/hgdev/pyenv/shims/%s' % python_version
+    else:
+        python = '/hgdev/pyenv/shims/python%s' % python_version
+
+    test_flags = ' '.join(shlex.quote(a) for a in test_flags)
+
+    command = (
+        '/bin/sh -c "export TMPDIR=/hgwork/tmp; '
+        'cd /hgwork/src/tests && %s run-tests.py %s"' % (
+            python, test_flags))
+
+    chan, stdin, stdout = exec_command(ssh_client, command)
+
+    stdin.close()
+
+    for line in stdout:
+        print(line, end='')
+
+    return chan.recv_exit_status()
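The interpreter selection at the top of `run_tests` can be factored into a standalone function. A sketch, assuming the pyenv shim layout created by `INSTALL_PYTHONS`:

```python
def python_binary(python_version):
    # system* map to distro-provided interpreters; everything else
    # resolves through the pyenv shims under /hgdev/pyenv.
    if python_version == 'system2':
        return '/usr/bin/python2'
    if python_version == 'system3':
        return '/usr/bin/python3'
    if python_version.startswith('pypy'):
        return '/hgdev/pyenv/shims/%s' % python_version
    return '/hgdev/pyenv/shims/python%s' % python_version
```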
diff --git a/contrib/automation/hgautomation/cli.py b/contrib/automation/hgautomation/cli.py
--- a/contrib/automation/hgautomation/cli.py
+++ b/contrib/automation/hgautomation/cli.py
@@ -8,20 +8,50 @@
 # no-check-code because Python 3 native.
 
 import argparse
+import concurrent.futures as futures
 import os
 import pathlib
+import time
 
 from . import (
     aws,
     HGAutomation,
+    linux,
     windows,
 )
 
 
 SOURCE_ROOT = pathlib.Path(os.path.abspath(__file__)).parent.parent.parent.parent
 DIST_PATH = SOURCE_ROOT / 'dist'
 
 
+def bootstrap_linux_dev(hga: HGAutomation, aws_region, distros=None,
+                        parallel=False):
+    c = hga.aws_connection(aws_region)
+
+    if distros:
+        distros = distros.split(',')
+    else:
+        distros = sorted(linux.DISTROS)
+
+    # TODO There is a wonky interaction involving KeyboardInterrupt whereby
+    # the context manager that is supposed to terminate the temporary EC2
+    # instance doesn't run. Until we fix this, make parallel building opt-in
+    # so we don't orphan instances.
+    if parallel:
+        fs = []
+
+        with futures.ThreadPoolExecutor(len(distros)) as e:
+            for distro in distros:
+                fs.append(e.submit(aws.ensure_linux_dev_ami, c, distro=distro))
+
+            for f in fs:
+                f.result()
+    else:
+        for distro in distros:
+            aws.ensure_linux_dev_ami(c, distro=distro)
+
+
 def bootstrap_windows_dev(hga: HGAutomation, aws_region):
     c = hga.aws_connection(aws_region)
     image = aws.ensure_windows_dev_ami(c)
@@ -107,6 +137,37 @@
     aws.remove_resources(c)
 
 
+def run_tests_linux(hga: HGAutomation, aws_region, instance_type,
+                    python_version, test_flags, distro, filesystem):
+    c = hga.aws_connection(aws_region)
+    image = aws.ensure_linux_dev_ami(c, distro=distro)
+
+    t_start = time.time()
+
+    ensure_extra_volume = filesystem not in ('default', 'tmpfs')
+
+    with aws.temporary_linux_dev_instances(
+        c, image, instance_type,
+        ensure_extra_volume=ensure_extra_volume) as insts:
+
+        instance = insts[0]
+
+        linux.prepare_exec_environment(instance.ssh_client,
+                                       filesystem=filesystem)
+        linux.synchronize_hg(SOURCE_ROOT, instance, '.')
+        t_prepared = time.time()
+        linux.run_tests(instance.ssh_client, python_version,
+                        test_flags)
+        t_done = time.time()
+
+    t_setup = t_prepared - t_start
+    t_all = t_done - t_start
+
+    print(
+        'total time: %.1fs; setup: %.1fs; tests: %.1fs; setup overhead: %.1f%%'
+        % (t_all, t_setup, t_done - t_prepared, t_setup / t_all * 100.0))
+
+
 def run_tests_windows(hga: HGAutomation, aws_region, instance_type,
                       python_version, arch, test_flags):
     c = hga.aws_connection(aws_region)
@@ -138,6 +199,21 @@
     subparsers = parser.add_subparsers()
 
     sp = subparsers.add_parser(
+        'bootstrap-linux-dev',
+        help='Bootstrap Linux development environments',
+    )
+    sp.add_argument(
+        '--distros',
+        help='Comma delimited list of distros to bootstrap',
+    )
+    sp.add_argument(
+        '--parallel',
+        action='store_true',
+        help='Generate AMIs in parallel (not CTRL-c safe)'
+    )
+    sp.set_defaults(func=bootstrap_linux_dev)
+
+    sp = subparsers.add_parser(
         'bootstrap-windows-dev',
         help='Bootstrap the Windows development environment',
     )
@@ -233,6 +309,41 @@
     sp.set_defaults(func=purge_ec2_resources)
 
     sp = subparsers.add_parser(
+        'run-tests-linux',
+        help='Run tests on Linux',
+    )
+    sp.add_argument(
+        '--distro',
+        help='Linux distribution to run tests on',
+        choices=linux.DISTROS,
+        default='debian9',
+    )
+    sp.add_argument(
+        '--filesystem',
+        help='Filesystem type to use',
+        choices={'btrfs', 'default', 'ext3', 'ext4', 'jfs', 'tmpfs', 'xfs'},
+        default='default',
+    )
+    sp.add_argument(
+        '--instance-type',
+        help='EC2 instance type to use',
+        default='c5.9xlarge',
+    )
+    sp.add_argument(
+        '--python-version',
+        help='Python version to use',
+        choices={'system2', 'system3', '2.7', '3.5', '3.6', '3.7', '3.8',
+                 'pypy', 'pypy3.5', 'pypy3.6'},
+        default='system2',
+    )
+    sp.add_argument(
+        'test_flags',
+        help='Extra command line flags to pass to run-tests.py',
+        nargs='*',
+    )
+    sp.set_defaults(func=run_tests_linux)
+
+    sp = subparsers.add_parser(
         'run-tests-windows',
         help='Run tests on Windows',
     )
diff --git a/contrib/automation/hgautomation/aws.py b/contrib/automation/hgautomation/aws.py
--- a/contrib/automation/hgautomation/aws.py
+++ b/contrib/automation/hgautomation/aws.py
@@ -19,6 +19,13 @@
 import boto3
 import botocore.exceptions
 
+from .linux import (
+    BOOTSTRAP_DEBIAN,
+)
+from .ssh import (
+    exec_command as ssh_exec_command,
+    wait_for_ssh,
+)
 from .winrm import (
     run_powershell,
     wait_for_winrm,
@@ -31,12 +38,46 @@
                                 'install-windows-dependencies.ps1')
 
 
+INSTANCE_TYPES_WITH_STORAGE = {
+    'c5d',
+    'd2',
+    'h1',
+    'i3',
+    'm5ad',
+    'm5d',
+    'r5d',
+    'r5ad',
+    'x1',
+    'z1d',
+}
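This set is later consulted in `temporary_linux_dev_instances` via a tuple-based `str.startswith` prefix match. Sketched in isolation (`needs_extra_ebs_volume` is a hypothetical helper name):

```python
INSTANCE_TYPES_WITH_STORAGE = {
    'c5d', 'd2', 'h1', 'i3', 'm5ad', 'm5d', 'r5d', 'r5ad', 'x1', 'z1d',
}

def needs_extra_ebs_volume(instance_type):
    # str.startswith() accepts a tuple of prefixes, so 'c5d.9xlarge'
    # matches the 'c5d' family and keeps its fast instance storage;
    # families without local storage get an EBS volume attached.
    return not instance_type.startswith(tuple(INSTANCE_TYPES_WITH_STORAGE))
```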
+
+
+DEBIAN_ACCOUNT_ID = '379101102735'
+UBUNTU_ACCOUNT_ID = '099720109477'
+
+
 KEY_PAIRS = {
     'automation',
 }
 
 
 SECURITY_GROUPS = {
+    'linux-dev-1': {
+        'description': 'Mercurial Linux instances that perform build/test automation',
+        'ingress': [
+            {
+                'FromPort': 22,
+                'ToPort': 22,
+                'IpProtocol': 'tcp',
+                'IpRanges': [
+                    {
+                        'CidrIp': '0.0.0.0/0',
+                        'Description': 'SSH from entire Internet',
+                    },
+                ],
+            },
+        ],
+    },
     'windows-dev-1': {
         'description': 'Mercurial Windows instances that perform build automation',
         'ingress': [
@@ -762,6 +803,231 @@
     return image
 
 
+def ensure_linux_dev_ami(c: AWSConnection, distro='debian9', prefix='hg-'):
+    """Ensures a Linux development AMI is available and up-to-date.
+
+    Returns an ``ec2.Image`` of either an existing AMI or a newly-built one.
+    """
+    ec2client = c.ec2client
+    ec2resource = c.ec2resource
+
+    name = '%s%s-%s' % (prefix, 'linux-dev', distro)
+
+    if distro == 'debian9':
+        image = find_image(
+            ec2resource,
+            DEBIAN_ACCOUNT_ID,
+            'debian-stretch-hvm-x86_64-gp2-2019-02-19-26620',
+        )
+        ssh_username = 'admin'
+    elif distro == 'ubuntu18.04':
+        image = find_image(
+            ec2resource,
+            UBUNTU_ACCOUNT_ID,
+            'ubuntu/images/hvm-ssd/ubuntu-bionic-18.04-amd64-server-20190403',
+        )
+        ssh_username = 'ubuntu'
+    elif distro == 'ubuntu18.10':
+        image = find_image(
+            ec2resource,
+            UBUNTU_ACCOUNT_ID,
+            'ubuntu/images/hvm-ssd/ubuntu-cosmic-18.10-amd64-server-20190402',
+        )
+        ssh_username = 'ubuntu'
+    elif distro == 'ubuntu19.04':
+        image = find_image(
+            ec2resource,
+            UBUNTU_ACCOUNT_ID,
+            'ubuntu/images/hvm-ssd/ubuntu-disco-19.04-amd64-server-20190417',
+        )
+        ssh_username = 'ubuntu'
+    else:
+        raise ValueError('unsupported Linux distro: %s' % distro)
+
+    config = {
+        'BlockDeviceMappings': [
+            {
+                'DeviceName': image.block_device_mappings[0]['DeviceName'],
+                'Ebs': {
+                    'DeleteOnTermination': True,
+                    'VolumeSize': 8,
+                    'VolumeType': 'gp2',
+                },
+            },
+        ],
+        'EbsOptimized': True,
+        'ImageId': image.id,
+        'InstanceInitiatedShutdownBehavior': 'stop',
+        # 8 VCPUs for compiling Python.
+        'InstanceType': 't3.2xlarge',
+        'KeyName': '%sautomation' % prefix,
+        'MaxCount': 1,
+        'MinCount': 1,
+        'SecurityGroupIds': [c.security_groups['linux-dev-1'].id],
+    }
+
+    requirements2_path = (pathlib.Path(__file__).parent.parent /
+                          'linux-requirements-py2.txt')
+    requirements3_path = (pathlib.Path(__file__).parent.parent /
+                          'linux-requirements-py3.txt')
+    with requirements2_path.open('r', encoding='utf-8') as fh:
+        requirements2 = fh.read()
+    with requirements3_path.open('r', encoding='utf-8') as fh:
+        requirements3 = fh.read()
+
+    # Compute a deterministic fingerprint to determine whether image needs to
+    # be regenerated.
+    fingerprint = resolve_fingerprint({
+        'instance_config': config,
+        'bootstrap_script': BOOTSTRAP_DEBIAN,
+        'requirements_py2': requirements2,
+        'requirements_py3': requirements3,
+    })
+
+    existing_image = find_and_reconcile_image(ec2resource, name, fingerprint)
+
+    if existing_image:
+        return existing_image
+
+    print('no suitable %s image found; creating one...' % name)
+
+    with temporary_ec2_instances(ec2resource, config) as instances:
+        wait_for_ip_addresses(instances)
+
+        instance = instances[0]
+
+        client = wait_for_ssh(
+            instance.public_ip_address, 22,
+            username=ssh_username,
+            key_filename=str(c.key_pair_path_private('automation')))
+
+        home = '/home/%s' % ssh_username
+
+        with client:
+            print('connecting to SSH server')
+            sftp = client.open_sftp()
+
+            print('uploading bootstrap files')
+            with sftp.open('%s/bootstrap' % home, 'wb') as fh:
+                fh.write(BOOTSTRAP_DEBIAN)
+                fh.chmod(0o0700)
+
+            with sftp.open('%s/requirements-py2.txt' % home, 'wb') as fh:
+                fh.write(requirements2)
+                fh.chmod(0o0700)
+
+            with sftp.open('%s/requirements-py3.txt' % home, 'wb') as fh:
+                fh.write(requirements3)
+                fh.chmod(0o0700)
+
+            print('executing bootstrap')
+            chan, stdin, stdout = ssh_exec_command(client,
+                                                   '%s/bootstrap' % home)
+            stdin.close()
+
+            for line in stdout:
+                print(line, end='')
+
+            res = chan.recv_exit_status()
+            if res:
+                raise Exception('non-0 exit from bootstrap: %d' % res)
+
+            print('bootstrap completed; stopping %s to create %s' % (
+                  instance.id, name))
+
+        return create_ami_from_instance(ec2client, instance, name,
+                                        'Mercurial Linux development environment',
+                                        fingerprint)
+
+
+@contextlib.contextmanager
+def temporary_linux_dev_instances(c: AWSConnection, image, instance_type,
+                                  prefix='hg-', ensure_extra_volume=False):
+    """Create temporary Linux development EC2 instances.
+
+    Context manager resolves to a list of ``ec2.Instance`` that were created
+    and are running.
+
+    ``ensure_extra_volume`` can be set to ``True`` to require that instances
+    have a 2nd storage volume available other than the primary AMI volume.
+    For instance types with instance storage, this does nothing special.
+    But for instance types without instance storage, an additional EBS volume
+    will be added to the instance.
+
+    Instances have an ``ssh_client`` attribute containing a paramiko SSHClient
+    instance bound to the instance.
+
+    Instances have an ``ssh_private_key_path`` attribute containing the
+    str path to the SSH private key used to connect to the instance.
+    """
+
+    block_device_mappings = [
+        {
+            'DeviceName': image.block_device_mappings[0]['DeviceName'],
+            'Ebs': {
+                'DeleteOnTermination': True,
+                'VolumeSize': 8,
+                'VolumeType': 'gp2',
+            },
+        }
+    ]
+
+    # This is not an exhaustive list of instance types having instance storage.
+    # But it covers the instance types we currently use.
+    if (ensure_extra_volume
+        and not instance_type.startswith(tuple(INSTANCE_TYPES_WITH_STORAGE))):
+        main_device = block_device_mappings[0]['DeviceName']
+
+        if main_device == 'xvda':
+            second_device = 'xvdb'
+        elif main_device == '/dev/sda1':
+            second_device = '/dev/sdb'
+        else:
+            raise ValueError('unhandled primary EBS device name: %s' %
+                             main_device)
+
+        block_device_mappings.append({
+            'DeviceName': second_device,
+            'Ebs': {
+                'DeleteOnTermination': True,
+                'VolumeSize': 8,
+                'VolumeType': 'gp2',
+            }
+        })
+
+    config = {
+        'BlockDeviceMappings': block_device_mappings,
+        'EbsOptimized': True,
+        'ImageId': image.id,
+        'InstanceInitiatedShutdownBehavior': 'terminate',
+        'InstanceType': instance_type,
+        'KeyName': '%sautomation' % prefix,
+        'MaxCount': 1,
+        'MinCount': 1,
+        'SecurityGroupIds': [c.security_groups['linux-dev-1'].id],
+    }
+
+    with temporary_ec2_instances(c.ec2resource, config) as instances:
+        wait_for_ip_addresses(instances)
+
+        ssh_private_key_path = str(c.key_pair_path_private('automation'))
+
+        for instance in instances:
+            client = wait_for_ssh(
+                instance.public_ip_address, 22,
+                username='hg',
+                key_filename=ssh_private_key_path)
+
+            instance.ssh_client = client
+            instance.ssh_private_key_path = ssh_private_key_path
+
+        try:
+            yield instances
+        finally:
+            for instance in instances:
+                instance.ssh_client.close()
+
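The primary/secondary EBS device-name pairing used above can be illustrated as a standalone helper. This is just a sketch for clarity; ``secondary_device_name`` is a hypothetical name and not part of the patch:

```python
def secondary_device_name(main_device):
    """Map a primary EBS device name to a matching secondary name.

    Mirrors the pairing used when attaching an extra EBS volume:
    Amazon-Linux-style names ('xvda') pair with 'xvdb', while
    Debian-style names ('/dev/sda1') pair with '/dev/sdb'.
    """
    if main_device == 'xvda':
        return 'xvdb'
    elif main_device == '/dev/sda1':
        return '/dev/sdb'

    raise ValueError('unhandled primary EBS device name: %s' % main_device)
```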
+
 def ensure_windows_dev_ami(c: AWSConnection, prefix='hg-'):
     """Ensure Windows Development AMI is available and up-to-date.
 
diff --git a/contrib/automation/README.rst b/contrib/automation/README.rst
--- a/contrib/automation/README.rst
+++ b/contrib/automation/README.rst
@@ -101,9 +101,14 @@
 * Storage costs for AMI / EBS snapshots. This should be just a few pennies
   per month.
 
-When running EC2 instances, you'll be billed accordingly. By default, we
-use *small* instances, like ``t3.medium``. This instance type costs ~$0.07 per
-hour.
+When running EC2 instances, you'll be billed accordingly. Default instance
+types vary by operation. We try to be respectful of your money when choosing
+defaults. For Windows instances, which are billed per hour, we use a modest
+instance type like ``t3.medium``, which costs ~$0.07 per hour. For operations
+that scale well to many CPUs, like running Linux tests, we may use a more
+powerful instance like ``c5.9xlarge``. However, since Linux instances are
+billed per second and running a ``c5.9xlarge`` for half the time of a
+``c5.4xlarge`` costs roughly the same, the choice is justified.
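The per-second billing argument can be checked with simple arithmetic. The rates below are illustrative placeholders, not actual AWS prices:

```python
def run_cost(hourly_rate, seconds):
    """Cost of running a per-second-billed EC2 instance for ``seconds``."""
    return hourly_rate / 3600.0 * seconds

# Hypothetical rates: an instance that costs twice as much per hour but
# finishes the test run in half the wall time costs the same overall.
smaller = run_cost(0.75, 600)  # 10 minutes on the smaller instance
larger = run_cost(1.50, 300)   # 5 minutes on the larger instance
assert abs(smaller - larger) < 1e-9
```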
 
 .. note::
 
@@ -125,3 +130,54 @@
 To purge all EC2 resources that we manage::
 
    $ automation.py purge-ec2-resources
+
+Remote Machine Interfaces
+=========================
+
+The code that connects to a remote machine and executes things is
+theoretically machine agnostic as long as the remote machine conforms to
+an *interface*. In other words, to perform actions like running tests
+remotely or triggering packaging, it shouldn't matter if the remote machine
+is an EC2 instance, a virtual machine, etc. This section attempts to document
+the interface that remote machines need to provide in order to be valid
+*targets* for remote execution. These interfaces are often neither ideal
+nor the most flexible. Instead, they have evolved along with the
+requirements of our automation code.
+
+Linux
+-----
+
+Remote Linux machines expose an SSH server on port 22. The SSH server
+must allow the ``hg`` user to authenticate using the SSH key generated by
+the automation code. The ``hg`` user should be part of the ``hg`` group
+and it should have ``sudo`` access without password prompting.
+
+The SSH channel must support SFTP to facilitate transferring files from
+client to server.
+
+``/bin/bash`` must be executable and point to a bash shell executable.
+
+The ``/hgdev`` directory must exist and all of its content must be owned
+by ``hg:hg``.
+
+The ``/hgdev/pyenv`` directory should contain an installation of
+``pyenv``. Various Python distributions should be installed. The exact
+versions shouldn't matter. ``pyenv global`` should have been run so
+``/hgdev/pyenv/shims/`` is populated with redirector scripts that point
+to the appropriate Python executable.
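The relationship between the ``--python`` argument and the interpreters described above could be sketched like this. ``resolve_python`` is a hypothetical helper for illustration; the exact mapping in the patch may differ:

```python
def resolve_python(python):
    """Map a ``--python`` argument to an interpreter path on the instance.

    ``systemN`` values refer to the distribution's Python; version
    strings refer to pyenv shims under /hgdev/pyenv/shims/.
    """
    if python == 'system2':
        return '/usr/bin/python2'
    if python == 'system3':
        return '/usr/bin/python3'

    return '/hgdev/pyenv/shims/python%s' % python
```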
+
+The ``/hgdev/venv-bootstrap`` directory must contain a virtualenv
+with Mercurial installed. The ``/hgdev/venv-bootstrap/bin/hg`` executable
+is referenced by various scripts and the client.
+
+The ``/hgdev/src`` directory MUST contain a clone of the Mercurial
+source code. The state of the working directory is not important.
+
+In order to run tests, the ``/hgwork`` directory will be created.
+This may require running various ``mkfs.*`` executables and ``mount``
+to provision a new filesystem. This will require elevated privileges
+via ``sudo``.
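A sketch of how the ``/hgwork`` provisioning commands might be assembled. This only builds command lists; it is an assumption about the shape of the logic, not the patch's actual implementation, and executing the commands requires root via ``sudo`` on the instance:

```python
def hgwork_commands(device, fs_type='ext4'):
    """Build commands to create and mount a /hgwork filesystem.

    Returns a list of argv lists suitable for subprocess.run(). This
    function only constructs the commands; it does not execute them.
    """
    return [
        ['sudo', 'mkfs.%s' % fs_type, device],
        ['sudo', 'mkdir', '-p', '/hgwork'],
        ['sudo', 'mount', device, '/hgwork'],
        ['sudo', 'chown', 'hg:hg', '/hgwork'],
    ]
```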
+
+Various dependencies needed to run the Mercurial test harness must also
+be installed; documenting them is beyond the scope of this document. Some
+tests require additional, optional dependencies; the test runner prints
+any missing dependencies when a test is skipped.


