Example 1 with SaslPropertiesResolver

Use of org.apache.hadoop.security.SaslPropertiesResolver in project hadoop by apache.

From the class DataNode, the method checkSecureConfig:

/**
   * Checks if the DataNode has a secure configuration if security is enabled.
   * There are 2 possible configurations that are considered secure:
   * 1. The server has bound to privileged ports for RPC and HTTP via
   *   SecureDataNodeStarter.
   * 2. The configuration enables SASL on DataTransferProtocol and HTTPS (no
   *   plain HTTP) for the HTTP server.  The SASL handshake guarantees
   *   authentication of the RPC server before a client transmits a secret, such
   *   as a block access token.  Similarly, SSL guarantees authentication of the
   *   HTTP server before a client transmits a secret, such as a delegation
   *   token.
   * It is not possible to run with both privileged ports and SASL on
   * DataTransferProtocol.  For backwards-compatibility, the connection logic
   * must check if the target port is a privileged port, and if so, skip the
   * SASL handshake.
   *
   * @param dnConf DNConf to check
   * @param conf Configuration to check
   * @param resources SecureResources obtained for DataNode
   * @throws RuntimeException if security enabled, but configuration is insecure
   */
private static void checkSecureConfig(DNConf dnConf, Configuration conf,
        SecureResources resources) throws RuntimeException {
    if (!UserGroupInformation.isSecurityEnabled()) {
        return;
    }
    // Abort out of inconsistent state if Kerberos is enabled
    // but block access tokens are not enabled.
    boolean isEnabled = conf.getBoolean(DFSConfigKeys.DFS_BLOCK_ACCESS_TOKEN_ENABLE_KEY,
        DFSConfigKeys.DFS_BLOCK_ACCESS_TOKEN_ENABLE_DEFAULT);
    if (!isEnabled) {
        String errMessage = "Security is enabled but block access tokens " +
            "(via " + DFSConfigKeys.DFS_BLOCK_ACCESS_TOKEN_ENABLE_KEY + ") " +
            "aren't enabled. This may cause issues " +
            "when clients attempt to connect to a DataNode. Aborting DataNode";
        throw new RuntimeException(errMessage);
    }
    SaslPropertiesResolver saslPropsResolver = dnConf.getSaslPropsResolver();
    // Secure option 1: privileged resources were obtained via SecureDataNodeStarter.
    if (resources != null && saslPropsResolver == null) {
        return;
    }
    if (dnConf.getIgnoreSecurePortsForTesting()) {
        return;
    }
    // Secure option 2: SASL on DataTransferProtocol plus an HTTPS-only web server,
    // without privileged resources.
    if (saslPropsResolver != null &&
        DFSUtil.getHttpPolicy(conf) == HttpConfig.Policy.HTTPS_ONLY &&
        resources == null) {
        return;
    }
    throw new RuntimeException("Cannot start secure DataNode without " + "configuring either privileged resources or SASL RPC data transfer " + "protection and SSL for HTTP.  Using privileged resources in " + "combination with SASL RPC data transfer protection is not supported.");
}
Also used: SaslPropertiesResolver (org.apache.hadoop.security.SaslPropertiesResolver)
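
As a side note to the check above, here is a minimal sketch (not from the Hadoop sources) of a Configuration satisfying the second secure option: SASL on DataTransferProtocol plus an HTTPS-only web server. The string keys are the standard HDFS property names behind the DFSConfigKeys constants used above; the "privacy" QOP is an illustrative choice.

import org.apache.hadoop.conf.Configuration;

public class SecureDataNodeConfSketch {

    // Builds a configuration that passes checkSecureConfig via secure option 2:
    // SASL on DataTransferProtocol and an HTTPS-only web server.
    static Configuration saslAndHttpsOnly() {
        Configuration conf = new Configuration();
        // Required whenever Kerberos security is enabled (see the abort above).
        conf.setBoolean("dfs.block.access.token.enable", true);
        // Any QOP here makes dnConf.getSaslPropsResolver() non-null.
        conf.set("dfs.data.transfer.protection", "privacy");
        // Plain HTTP would fail the HTTPS_ONLY policy check.
        conf.set("dfs.http.policy", "HTTPS_ONLY");
        return conf;
    }
}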

Example 2 with SaslPropertiesResolver

Use of org.apache.hadoop.security.SaslPropertiesResolver in project hbase by apache.

From the class FanOutOneBlockAsyncDFSOutputSaslHelper, the method createSaslAdaptor:

private static SaslAdaptor createSaslAdaptor() throws NoSuchFieldException, NoSuchMethodException {
    // SaslDataTransferClient exposes no getters for these members, so the
    // private fields are resolved reflectively once and captured below.
    Field saslPropsResolverField = SaslDataTransferClient.class.getDeclaredField("saslPropsResolver");
    saslPropsResolverField.setAccessible(true);
    Field trustedChannelResolverField = SaslDataTransferClient.class.getDeclaredField("trustedChannelResolver");
    trustedChannelResolverField.setAccessible(true);
    Field fallbackToSimpleAuthField = SaslDataTransferClient.class.getDeclaredField("fallbackToSimpleAuth");
    fallbackToSimpleAuthField.setAccessible(true);
    return new SaslAdaptor() {

        @Override
        public TrustedChannelResolver getTrustedChannelResolver(SaslDataTransferClient saslClient) {
            try {
                return (TrustedChannelResolver) trustedChannelResolverField.get(saslClient);
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }

        @Override
        public SaslPropertiesResolver getSaslPropsResolver(SaslDataTransferClient saslClient) {
            try {
                return (SaslPropertiesResolver) saslPropsResolverField.get(saslClient);
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }

        @Override
        public AtomicBoolean getFallbackToSimpleAuth(SaslDataTransferClient saslClient) {
            try {
                return (AtomicBoolean) fallbackToSimpleAuthField.get(saslClient);
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
    };
}
Also used: Field (java.lang.reflect.Field), AtomicBoolean (java.util.concurrent.atomic.AtomicBoolean), TrustedChannelResolver (org.apache.hadoop.hdfs.protocol.datatransfer.TrustedChannelResolver), SaslPropertiesResolver (org.apache.hadoop.security.SaslPropertiesResolver), SaslDataTransferClient (org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient)
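
As a standalone illustration of the same cache-a-Field-once pattern, a hypothetical generic reader (not part of HBase) could look like this:

import java.lang.reflect.Field;

// Hypothetical generic reader illustrating the pattern above: resolve and
// open the private field once, then read it cheaply on every call.
class PrivateFieldReader<T> {

    private final Field field;

    PrivateFieldReader(Class<?> owner, String fieldName) throws NoSuchFieldException {
        field = owner.getDeclaredField(fieldName);
        // Bypasses private access; on JDK 9+ the owning module may need --add-opens.
        field.setAccessible(true);
    }

    @SuppressWarnings("unchecked")
    T read(Object target) {
        try {
            return (T) field.get(target);
        } catch (IllegalAccessException e) {
            // Mirror the adaptor above: surface as an unchecked failure.
            throw new RuntimeException(e);
        }
    }
}

A reader constructed as new PrivateFieldReader<SaslPropertiesResolver>(SaslDataTransferClient.class, "saslPropsResolver") would stand in for the first accessor above.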

Example 3 with SaslPropertiesResolver

Use of org.apache.hadoop.security.SaslPropertiesResolver in project hbase by apache.

From the class FanOutOneBlockAsyncDFSOutputSaslHelper, the method trySaslNegotiate:

static void trySaslNegotiate(Configuration conf, Channel channel, DatanodeInfo dnInfo,
        int timeoutMs, DFSClient client, Token<BlockTokenIdentifier> accessToken,
        Promise<Void> saslPromise) throws IOException {
    SaslDataTransferClient saslClient = client.getSaslDataTransferClient();
    SaslPropertiesResolver saslPropsResolver = SASL_ADAPTOR.getSaslPropsResolver(saslClient);
    TrustedChannelResolver trustedChannelResolver = SASL_ADAPTOR.getTrustedChannelResolver(saslClient);
    AtomicBoolean fallbackToSimpleAuth = SASL_ADAPTOR.getFallbackToSimpleAuth(saslClient);
    InetAddress addr = ((InetSocketAddress) channel.remoteAddress()).getAddress();
    if (trustedChannelResolver.isTrusted() || trustedChannelResolver.isTrusted(addr)) {
        saslPromise.trySuccess(null);
        return;
    }
    DataEncryptionKey encryptionKey = client.newDataEncryptionKey();
    if (encryptionKey != null) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("SASL client doing encrypted handshake for addr = " + addr + ", datanodeId = " + dnInfo);
        }
        doSaslNegotiation(conf, channel, timeoutMs, getUserNameFromEncryptionKey(encryptionKey),
            encryptionKeyToPassword(encryptionKey.encryptionKey),
            createSaslPropertiesForEncryption(encryptionKey.encryptionAlgorithm), saslPromise);
    } else if (!UserGroupInformation.isSecurityEnabled()) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("SASL client skipping handshake in unsecured configuration for addr = " + addr + ", datanodeId = " + dnInfo);
        }
        saslPromise.trySuccess(null);
    } else if (dnInfo.getXferPort() < 1024) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("SASL client skipping handshake in secured configuration with " + "privileged port for addr = " + addr + ", datanodeId = " + dnInfo);
        }
        saslPromise.trySuccess(null);
    } else if (fallbackToSimpleAuth != null && fallbackToSimpleAuth.get()) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("SASL client skipping handshake in secured configuration with " + "unsecured cluster for addr = " + addr + ", datanodeId = " + dnInfo);
        }
        saslPromise.trySuccess(null);
    } else if (saslPropsResolver != null) {
        if (LOG.isDebugEnabled()) {
            LOG.debug("SASL client doing general handshake for addr = " + addr + ", datanodeId = " + dnInfo);
        }
        doSaslNegotiation(conf, channel, timeoutMs, buildUsername(accessToken),
            buildClientPassword(accessToken), saslPropsResolver.getClientProperties(addr),
            saslPromise);
    } else {
        // Secured cluster using non-privileged ports but with no SASL protection
        // configured; the only way to get here is a DataNode running with
        // ignore.secure.ports.for.testing, so this is a rare edge case.
        if (LOG.isDebugEnabled()) {
            LOG.debug("SASL client skipping handshake in secured configuration with no SASL " + "protection configured for addr = " + addr + ", datanodeId = " + dnInfo);
        }
        saslPromise.trySuccess(null);
    }
}
Also used: AtomicBoolean (java.util.concurrent.atomic.AtomicBoolean), DataEncryptionKey (org.apache.hadoop.hdfs.security.token.block.DataEncryptionKey), InetSocketAddress (java.net.InetSocketAddress), TrustedChannelResolver (org.apache.hadoop.hdfs.protocol.datatransfer.TrustedChannelResolver), SaslPropertiesResolver (org.apache.hadoop.security.SaslPropertiesResolver), InetAddress (java.net.InetAddress), SaslDataTransferClient (org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient)
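
Both this client-side short-circuit and the server-side check in Example 5 delegate trust decisions to a pluggable TrustedChannelResolver. A minimal illustrative subclass, assuming it is registered through the standard dfs.trustedchannel.resolver.class property, might be:

import java.net.InetAddress;
import org.apache.hadoop.hdfs.protocol.datatransfer.TrustedChannelResolver;

// Illustrative resolver that trusts only loopback peers, so local connections
// skip SASL negotiation while remote ones still handshake.
public class LoopbackTrustedChannelResolver extends TrustedChannelResolver {

    @Override
    public boolean isTrusted() {
        // No blanket trust for the local endpoint itself.
        return false;
    }

    @Override
    public boolean isTrusted(InetAddress peerAddress) {
        return peerAddress.isLoopbackAddress();
    }
}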

Example 4 with SaslPropertiesResolver

Use of org.apache.hadoop.security.SaslPropertiesResolver in project hadoop by apache.

From the class DataTransferSaslUtil, the method getSaslPropertiesResolver:

/**
   * Creates a SaslPropertiesResolver from the given configuration.  This method
   * works by cloning the configuration, translating configuration properties
   * specific to DataTransferProtocol to what SaslPropertiesResolver expects,
   * and then delegating to SaslPropertiesResolver for initialization.  This
   * method returns null if SASL protection has not been configured for
   * DataTransferProtocol.
   *
   * @param conf configuration to read
   * @return SaslPropertiesResolver for DataTransferProtocol, or null if not
   *   configured
   */
public static SaslPropertiesResolver getSaslPropertiesResolver(Configuration conf) {
    String qops = conf.get(DFS_DATA_TRANSFER_PROTECTION_KEY);
    if (qops == null || qops.isEmpty()) {
        LOG.debug("DataTransferProtocol not using SaslPropertiesResolver, no " + "QOP found in configuration for {}", DFS_DATA_TRANSFER_PROTECTION_KEY);
        return null;
    }
    Configuration saslPropsResolverConf = new Configuration(conf);
    saslPropsResolverConf.set(HADOOP_RPC_PROTECTION, qops);
    Class<? extends SaslPropertiesResolver> resolverClass = conf.getClass(
        HADOOP_SECURITY_SASL_PROPS_RESOLVER_CLASS,
        SaslPropertiesResolver.class, SaslPropertiesResolver.class);
    // A DataTransferProtocol-specific resolver class, if configured, overrides
    // the general hadoop.security one.
    resolverClass = conf.getClass(DFS_DATA_TRANSFER_SASL_PROPS_RESOLVER_CLASS_KEY,
        resolverClass, SaslPropertiesResolver.class);
    saslPropsResolverConf.setClass(HADOOP_SECURITY_SASL_PROPS_RESOLVER_CLASS,
        resolverClass, SaslPropertiesResolver.class);
    SaslPropertiesResolver resolver = SaslPropertiesResolver.getInstance(saslPropsResolverConf);
    LOG.debug("DataTransferProtocol using SaslPropertiesResolver, configured " + "QOP {} = {}, configured class {} = {}", DFS_DATA_TRANSFER_PROTECTION_KEY, qops, DFS_DATA_TRANSFER_SASL_PROPS_RESOLVER_CLASS_KEY, resolverClass);
    return resolver;
}
Also used: Configuration (org.apache.hadoop.conf.Configuration), ByteString (com.google.protobuf.ByteString), SaslPropertiesResolver (org.apache.hadoop.security.SaslPropertiesResolver)
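
A short usage sketch of this factory method (not from the Hadoop sources); the QOP list and the printed output are illustrative:

import java.net.InetAddress;
import java.util.Map;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil;
import org.apache.hadoop.security.SaslPropertiesResolver;

public class ResolverUsageSketch {

    public static void main(String[] args) {
        Configuration conf = new Configuration();
        // dfs.data.transfer.protection is DFS_DATA_TRANSFER_PROTECTION_KEY;
        // several QOPs may be listed and are offered during negotiation.
        conf.set("dfs.data.transfer.protection", "authentication,privacy");
        SaslPropertiesResolver resolver =
            DataTransferSaslUtil.getSaslPropertiesResolver(conf);
        // A null resolver means no QOP was configured, i.e. SASL is off.
        if (resolver != null) {
            Map<String, String> props =
                resolver.getClientProperties(InetAddress.getLoopbackAddress());
            System.out.println(props); // e.g. javax.security.sasl.qop -> auth,auth-conf
        }
    }
}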

Example 5 with SaslPropertiesResolver

Use of org.apache.hadoop.security.SaslPropertiesResolver in project hadoop by apache.

From the class SaslDataTransferServer, the method getSaslStreams:

/**
   * Receives SASL negotiation for general-purpose handshake.
   *
   * @param peer connection peer
   * @param underlyingOut connection output stream
   * @param underlyingIn connection input stream
   * @return new pair of streams, wrapped after SASL negotiation
   * @throws IOException for any error
   */
private IOStreamPair getSaslStreams(Peer peer, OutputStream underlyingOut,
        InputStream underlyingIn) throws IOException {
    if (peer.hasSecureChannel() ||
        dnConf.getTrustedChannelResolver().isTrusted(getPeerAddress(peer))) {
        return new IOStreamPair(underlyingIn, underlyingOut);
    }
    // Callers reach this method only when SASL on DataTransferProtocol is
    // configured, so the resolver is non-null here.
    SaslPropertiesResolver saslPropsResolver = dnConf.getSaslPropsResolver();
    Map<String, String> saslProps = saslPropsResolver.getServerProperties(getPeerAddress(peer));
    CallbackHandler callbackHandler = new SaslServerCallbackHandler(new PasswordFunction() {

        @Override
        public char[] apply(String userName) throws IOException {
            return buildServerPassword(userName);
        }
    });
    return doSaslHandshake(peer, underlyingOut, underlyingIn, saslProps, callbackHandler);
}
Also used: CallbackHandler (javax.security.auth.callback.CallbackHandler), IOStreamPair (org.apache.hadoop.hdfs.protocol.datatransfer.IOStreamPair), IOException (java.io.IOException), SaslPropertiesResolver (org.apache.hadoop.security.SaslPropertiesResolver)
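
For context, a sketch (with hypothetical helper names, not from the Hadoop sources) of how a caller consumes the negotiated pair; whether it carries the raw socket streams or SASL-wrapped ones, downstream protocol code is identical:

import java.io.DataInputStream;
import java.io.DataOutputStream;
import org.apache.hadoop.hdfs.protocol.datatransfer.IOStreamPair;

final class SaslStreamConsumerSketch {

    // IOStreamPair exposes its streams as public fields; callers simply layer
    // their usual stream decorators on top, unaware of any SASL wrapping.
    static DataInputStream dataIn(IOStreamPair streams) {
        return new DataInputStream(streams.in);
    }

    static DataOutputStream dataOut(IOStreamPair streams) {
        return new DataOutputStream(streams.out);
    }
}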

Aggregations

SaslPropertiesResolver (org.apache.hadoop.security.SaslPropertiesResolver): 5 uses
AtomicBoolean (java.util.concurrent.atomic.AtomicBoolean): 2 uses
TrustedChannelResolver (org.apache.hadoop.hdfs.protocol.datatransfer.TrustedChannelResolver): 2 uses
SaslDataTransferClient (org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient): 2 uses
ByteString (com.google.protobuf.ByteString): 1 use
IOException (java.io.IOException): 1 use
Field (java.lang.reflect.Field): 1 use
InetAddress (java.net.InetAddress): 1 use
InetSocketAddress (java.net.InetSocketAddress): 1 use
CallbackHandler (javax.security.auth.callback.CallbackHandler): 1 use
Configuration (org.apache.hadoop.conf.Configuration): 1 use
IOStreamPair (org.apache.hadoop.hdfs.protocol.datatransfer.IOStreamPair): 1 use
DataEncryptionKey (org.apache.hadoop.hdfs.security.token.block.DataEncryptionKey): 1 use