Windows binaries for Hadoop versions

Hadoop's native Windows support is implemented in HADOOP.DLL and WINUTILS.EXE; in particular, %HADOOP_HOME%\bin\winutils.exe must be locatable, or Hadoop fails at startup. This bites Spark users too, because the Hadoop configuration is initialized every time a Spark context is created, whether or not Hadoop itself is needed.
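
As an illustration of that requirement, a typical setup points HADOOP_HOME at the directory where the binaries were unpacked and puts its bin directory on the PATH; the install path here is just an example, not anything these releases mandate:

```
REM Windows cmd; adjust the path to wherever the winutils binaries were unpacked
set HADOOP_HOME=C:\hadoop
set PATH=%HADOOP_HOME%\bin;%PATH%

REM running winutils.exe with no arguments prints its usage, confirming it can be found
%HADOOP_HOME%\bin\winutils.exe
```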

These are built directly from the same git commit used to create the official ASF releases; they are checked out and built on a Windows VM which is dedicated purely to testing Hadoop/YARN apps on Windows. It is not a day-to-day used system, so is isolated from driveby/email security attacks.

Security: can you trust this release?

  1. I am the Hadoop committer 'stevel': I have nothing to gain by creating malicious versions of these binaries. If I wanted to run anything on your systems, I'd be able to add the code into Hadoop itself.
  2. I'm signing the releases; a verification sketch follows this list.
  3. My keys are published on the ASF committer keylist under my username.
  4. The latest GPG key (E7E4 26DF 6228 1B63 D679 6A81 950C C3E0 32B7 9CA2) actually lives on a yubikey for physical security; the signing takes place there.
  5. The same yubikey is used for 2FA to github, for uploading artifacts and making the release.
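
To check a download against those signatures, something along these lines works (the artifact name is illustrative, and the key can equally be fetched from the ASF committer keylist rather than a keyserver):

```
# import the signing key by the fingerprint listed above, then verify a downloaded artifact
gpg --recv-keys 950CC3E032B79CA2
gpg --verify hadoop-2.8.3.zip.asc hadoop-2.8.3.zip
```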

Someone malicious would need physical access to my office to sign artifacts under my name. If they could do that, they could commit malicious code into Hadoop itself, even signing those commits with the same GPG key. Though they'd need the PIN to unlock the key, which I have to type in whenever the laptop wakes up and I want to sign something. That'd take getting something malicious onto my machine, or sniffing the bluetooth packets from the keyboard to the laptop. Were someone to get physical access to my machine, they could probably install a malicious version of git, one which modified code before the checkin. I don't actually re-review my own patches to verify that there's been no tampering, but we do tend to keep an eye on what our peers put in.

The other tactic would have been for a malicious yubikey to end up being delivered by Amazon to my house. I don't have any defences against anyone going to that level of effort.

2017-12 Update: That key has been revoked, though it was never actually compromised. There was a lack of randomness in the prime number generator on the yubikey, hence an emergency cancel session. I've not yet set things up properly again.

Note: Artifacts prior to Hadoop 2.8.0-RC3 [were signed with a different key](https://pgp.mit.edu/pks/lookup?op=vindex&search=0xA92454F9174786B4); again, on the ASF key list.

Build Process

A dedicated Windows Server 2012 VM is used for building and testing Hadoop stack artifacts. It is not used for anything else.

This uses a VS build setup from 2010; compiler and linker version: 16.00.30319.01 for x64

Maven 3.3.9 was used; its signature was checked to be that of Jason@maven.org. While my key list doesn't directly trust that signature, I do trust those of other signatories:

Java 1.8:

release process

Windows VM


In hadoop-trunk

The version to build is checked out at the declared SHA1 checksum of the release/RC, hopefully moving to signed tags once signing becomes more common there.
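
As a sketch of that step (the repository URL is the ASF GitHub mirror; the commit hash is whatever was declared for the release vote, held here in a shell variable):

```
# clone the Hadoop source and check out the exact commit of the release/RC
git clone https://github.com/apache/hadoop.git
cd hadoop
git checkout "$RELEASE_SHA1"   # the SHA1 checksum declared for the release/RC
```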

The build was executed, relying on the fact that the native-win profile is automatic on Windows:
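
The exact flags aren't recorded here, so treat this as an assumption based on the standard Hadoop distribution build; -Pdist creates the distribution, and the native-win profile does not need to be named because it activates automatically on Windows:

```
REM from the Hadoop source root, in a shell with the Visual Studio/Windows SDK tools set up
mvn package -Pdist -DskipTests -Dtar
```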

This creates a distribution, with the native binaries under hadoop-dist\target\hadoop-X.Y.Z\bin.

Winutils

Create a zip file containing the contents of the winutils\%VERSION% directory. This is done on the Windows machine to avoid any risk of the Windows line-ending files getting modified by git. This isn't committed to git, just copied over to the host machine via the mounted file share.
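
Any zip tool works for this; since a JDK is already on the VM, one option (an assumption, not necessarily what was actually used) is the JDK's jar tool, whose -M flag omits the manifest so the output is a plain zip of the directory:

```
REM Windows cmd on the build VM; the version string is illustrative
set VERSION=2.8.3
jar cMf hadoop-%VERSION%.zip -C winutils\%VERSION% .
```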

Host machine: Sign everything

Pull down the newly added files from github, then sign the binary ones and push the .asc signatures back.

There isn't a way to sign multiple files in gpg2 on the command line, so it's either write a loop in bash or just edit the line and let path completion simplify your life. Here's the list of sign commands:
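
A bash loop along these lines does it; the filenames are illustrative (the real list is whatever binaries were just added), and each command writes a detached signature next to its input file:

```
# on the host machine, in the directory holding the newly added binaries
for f in winutils.exe winutils.pdb libwinutils.lib hadoop.dll hadoop.lib hdfs.dll; do
  gpg2 --armor --detach-sign "$f"    # produces "$f.asc"
done
```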

Verify the existence of the signature files, then

Then go to the directory with the zip file and sign that file too
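
The same command applies there (directory and filename again illustrative):

```
# on the host machine
cd path/to/the/zip            # wherever the zip was copied; a placeholder, not a real path
gpg2 --armor --detach-sign hadoop-2.8.3.zip
```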

github, create the release

  1. Go to the github repository
  2. Verify the most recent commit is visible
  3. Tag the release with the Hadoop version, including the commit checksum it was built from
  4. Drop in the .zip and .zip.asc files as binary artifacts