
Example 1 with NonNull

Use of lombok.NonNull in project metacat by Netflix.

From the class HiveConnectorFastPartitionService, method getPartitionCount.

/**
 * Number of partitions for the given table.
 *
 * @param requestContext the request context
 * @param tableName      the qualified table name
 * @return the number of partitions
 */
@Override
public int getPartitionCount(@Nonnull @NonNull final ConnectorContext requestContext, @Nonnull @NonNull final QualifiedName tableName) {
    final long start = registry.clock().monotonicTime();
    final Map<String, String> tags = new HashMap<>();
    tags.put("request", HiveMetrics.getPartitionCount.name());
    final Integer result;
    final DataSource dataSource = DataSourceManager.get().get(catalogName);
    try (Connection conn = dataSource.getConnection()) {
        // Handler for reading the result set
        final ResultSetHandler<Integer> handler = rs -> {
            int count = 0;
            while (rs.next()) {
                count = rs.getInt("count");
            }
            return count;
        };
        result = new QueryRunner().query(conn, SQL_GET_PARTITION_COUNT, handler, tableName.getDatabaseName(), tableName.getTableName());
    } catch (SQLException e) {
        throw new ConnectorException("getPartitionCount", e);
    } finally {
        // Spectator's monotonic clock is in nanoseconds, so record the duration as such.
        final long duration = registry.clock().monotonicTime() - start;
        log.debug("### Time taken to complete getPartitionCount is {} ms", TimeUnit.NANOSECONDS.toMillis(duration));
        this.registry.timer(requestTimerId.withTags(tags)).record(duration, TimeUnit.NANOSECONDS);
    }
    return result;
}
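Note the doubled annotations on each parameter: javax.annotation's @Nonnull is only a hint for static analysis tools, while lombok's @NonNull makes the compiled code include an actual runtime null check. A minimal sketch of what lombok produces (the class and method names here are illustrative, and the exact exception message varies by lombok version):

import lombok.NonNull;

public class Greeter {
    // With lombok on the annotation processor path, this:
    public String greet(@NonNull final String name) {
        return "Hello, " + name;
    }

    // is compiled roughly as if it were written as:
    public String greetDelomboked(final String name) {
        if (name == null) {
            throw new NullPointerException("name is marked non-null but is null");
        }
        return "Hello, " + name;
    }
}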

Example 2 with NonNull

Use of lombok.NonNull in project metacat by Netflix.

From the class HiveConnectorFastPartitionService, method getpartitions.

private List<PartitionInfo> getpartitions(@Nonnull @NonNull final String databaseName, @Nonnull @NonNull final String tableName, @Nullable final List<String> partitionIds, final String filterExpression, final Sort sort, final Pageable pageable, final boolean includePartitionDetails) {
    final FilterPartition filter = new FilterPartition();
    // Whether the filter expression references the batch-id and date-created fields.
    final boolean isBatched = !Strings.isNullOrEmpty(filterExpression) && filterExpression.contains(FIELD_BATCHID);
    final boolean hasDateCreated = !Strings.isNullOrEmpty(filterExpression) && filterExpression.contains(FIELD_DATE_CREATED);
    // Handler for reading the result set
    final ResultSetHandler<List<PartitionDetail>> handler = rs -> {
        final List<PartitionDetail> result = Lists.newArrayList();
        while (rs.next()) {
            final String name = rs.getString("name");
            final String uri = rs.getString("uri");
            final long createdDate = rs.getLong(FIELD_DATE_CREATED);
            Map<String, String> values = null;
            if (hasDateCreated) {
                values = Maps.newHashMap();
                values.put(FIELD_DATE_CREATED, String.valueOf(createdDate));
            }
            if (Strings.isNullOrEmpty(filterExpression) || filter.evaluatePartitionExpression(filterExpression, name, uri, isBatched, values)) {
                final Long id = rs.getLong("id");
                final Long sdId = rs.getLong("sd_id");
                final Long serdeId = rs.getLong("serde_id");
                final String inputFormat = rs.getString("input_format");
                final String outputFormat = rs.getString("output_format");
                final String serializationLib = rs.getString("slib");
                final StorageInfo storageInfo = new StorageInfo();
                storageInfo.setUri(uri);
                storageInfo.setInputFormat(inputFormat);
                storageInfo.setOutputFormat(outputFormat);
                storageInfo.setSerializationLib(serializationLib);
                final AuditInfo auditInfo = new AuditInfo();
                auditInfo.setCreatedDate(Date.from(Instant.ofEpochSecond(createdDate)));
                auditInfo.setLastModifiedDate(Date.from(Instant.ofEpochSecond(createdDate)));
                result.add(new PartitionDetail(id, sdId, serdeId, PartitionInfo.builder().name(QualifiedName.ofPartition(catalogName, databaseName, tableName, name)).auditInfo(auditInfo).serde(storageInfo).build()));
            }
        }
        return result;
    };
    final List<PartitionInfo> partitionInfos = new ArrayList<>();
    final List<PartitionDetail> partitions = getHandlerResults(databaseName, tableName, filterExpression, partitionIds, SQL_GET_PARTITIONS, handler, sort, pageable);
    if (includePartitionDetails && !partitions.isEmpty()) {
        final List<Long> partIds = Lists.newArrayListWithCapacity(partitions.size());
        final List<Long> sdIds = Lists.newArrayListWithCapacity(partitions.size());
        final List<Long> serdeIds = Lists.newArrayListWithCapacity(partitions.size());
        for (PartitionDetail partitionDetail : partitions) {
            partIds.add(partitionDetail.getId());
            sdIds.add(partitionDetail.getSdId());
            serdeIds.add(partitionDetail.getSerdeId());
        }
        final List<ListenableFuture<Void>> futures = Lists.newArrayList();
        final Map<Long, Map<String, String>> partitionParams = Maps.newHashMap();
        futures.add(threadServiceManager.getExecutor().submit(() -> populateParameters(partIds, SQL_GET_PARTITION_PARAMS, "part_id", partitionParams)));
        final Map<Long, Map<String, String>> sdParams = Maps.newHashMap();
        if (!sdIds.isEmpty()) {
            futures.add(threadServiceManager.getExecutor().submit(() -> populateParameters(sdIds, SQL_GET_SD_PARAMS, "sd_id", sdParams)));
        }
        final Map<Long, Map<String, String>> serdeParams = Maps.newHashMap();
        if (!serdeIds.isEmpty()) {
            futures.add(threadServiceManager.getExecutor().submit(() -> populateParameters(serdeIds, SQL_GET_SERDE_PARAMS, "serde_id", serdeParams)));
        }
        try {
            Futures.transform(Futures.successfulAsList(futures), Functions.constant(null)).get(1, TimeUnit.HOURS);
        } catch (final Exception e) {
            throw Throwables.propagate(e);
        }
        for (PartitionDetail partitionDetail : partitions) {
            partitionDetail.getPartitionInfo().setMetadata(partitionParams.get(partitionDetail.getId()));
            partitionDetail.getPartitionInfo().getSerde().setParameters(sdParams.get(partitionDetail.getSdId()));
            partitionDetail.getPartitionInfo().getSerde().setSerdeInfoParameters(serdeParams.get(partitionDetail.getSerdeId()));
        }
    }
    for (PartitionDetail partitionDetail : partitions) {
        partitionInfos.add(partitionDetail.getPartitionInfo());
    }
    return partitionInfos;
}
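The ResultSetHandler used above comes from Apache Commons DbUtils: QueryRunner executes the SQL, binds the positional parameters, and hands the raw ResultSet to the handler, which maps it into a typed result. A self-contained sketch of the same pattern (the in-memory H2 URL, table, and column names are illustrative, not taken from metacat):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;
import org.apache.commons.dbutils.QueryRunner;
import org.apache.commons.dbutils.ResultSetHandler;

public class DbUtilsSketch {
    public static void main(final String[] args) throws SQLException {
        // Map each row's "name" column into a list of strings.
        final ResultSetHandler<List<String>> handler = rs -> {
            final List<String> names = new ArrayList<>();
            while (rs.next()) {
                names.add(rs.getString("name"));
            }
            return names;
        };
        try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:test")) {
            final List<String> names = new QueryRunner()
                    .query(conn, "SELECT name FROM partitions WHERE db = ?", handler, "mydb");
            System.out.println(names);
        }
    }
}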

Example 3 with NonNull

Use of lombok.NonNull in project metacat by Netflix.

From the class HiveConnectorTableService, method list.

/**
 * {@inheritDoc}
 */
@Override
public List<TableInfo> list(@Nonnull @NonNull final ConnectorContext requestContext, @Nonnull @NonNull final QualifiedName name, @Nullable final QualifiedName prefix, @Nullable final Sort sort, @Nullable final Pageable pageable) {
    try {
        final List<TableInfo> tableInfos = Lists.newArrayList();
        for (String tableName : metacatHiveClient.getAllTables(name.getDatabaseName())) {
            final QualifiedName qualifiedName = QualifiedName.ofDatabase(name.getCatalogName(), tableName);
            if (prefix != null && !qualifiedName.toString().startsWith(prefix.toString())) {
                continue;
            }
            final Table table = metacatHiveClient.getTableByName(name.getDatabaseName(), tableName);
            tableInfos.add(hiveMetacatConverters.toTableInfo(name, table));
        }
        // Supporting sort by name only.
        if (sort != null) {
            ConnectorUtils.sort(tableInfos, sort, Comparator.comparing(p -> p.getName().getTableName()));
        }
        return ConnectorUtils.paginate(tableInfos, pageable);
    } catch (MetaException exception) {
        throw new DatabaseNotFoundException(name, exception);
    } catch (TException exception) {
        throw new ConnectorException(String.format("Failed to list hive tables for %s", name), exception);
    }
}
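ConnectorUtils.sort and ConnectorUtils.paginate are metacat helper methods, but the underlying idea is just a Comparator-driven sort followed by an offset/limit slice. A rough standalone sketch under that assumption (the Page record is a hypothetical stand-in for metacat's Pageable):

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;
import java.util.stream.Collectors;

public final class PagingSketch {
    // Assumed minimal stand-in for a Pageable: skip `offset` items, keep `limit`.
    record Page(long offset, long limit) { }

    static <T> List<T> paginate(final List<T> items, final Page page) {
        if (page == null) {
            return items;
        }
        return items.stream().skip(page.offset()).limit(page.limit()).collect(Collectors.toList());
    }

    public static void main(final String[] args) {
        final List<String> tables = new ArrayList<>(List.of("b_table", "c_table", "a_table"));
        tables.sort(Comparator.naturalOrder());
        System.out.println(paginate(tables, new Page(1, 1))); // prints [b_table]
    }
}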

Example 4 with NonNull

Use of lombok.NonNull in project cas by Apereo.

From the class PolicyBasedAuthenticationManager, method authenticateInternal.

/**
 * Authenticate internal authentication builder.
 *
 * @param transaction the transaction
 * @return the authentication builder
 * @throws AuthenticationException the authentication exception
 */
protected AuthenticationBuilder authenticateInternal(final AuthenticationTransaction transaction) throws AuthenticationException {
    final Collection<Credential> credentials = transaction.getCredentials();
    LOGGER.debug("Authentication credentials provided for this transaction are [{}]", credentials);
    if (credentials.isEmpty()) {
        LOGGER.error("Resolved authentication handlers for this transaction are empty");
        throw new AuthenticationException("Resolved credentials for this transaction are empty");
    }
    final AuthenticationBuilder builder = new DefaultAuthenticationBuilder(NullPrincipal.getInstance());
    credentials.forEach(cred -> builder.addCredential(new BasicCredentialMetaData(cred)));
    @NonNull final Set<AuthenticationHandler> handlerSet = getAuthenticationHandlersForThisTransaction(transaction);
    LOGGER.debug("Candidate resolved authentication handlers for this transaction are [{}]", handlerSet);
    if (handlerSet.isEmpty()) {
        LOGGER.error("Resolved authentication handlers for this transaction are empty");
        throw new AuthenticationException(builder.getFailures(), builder.getSuccesses());
    }
    try {
        final Iterator<Credential> it = credentials.iterator();
        AuthenticationCredentialsThreadLocalBinder.clearInProgressAuthentication();
        while (it.hasNext()) {
            final Credential credential = it.next();
            LOGGER.debug("Attempting to authenticate credential [{}]", credential);
            final Iterator<AuthenticationHandler> itHandlers = handlerSet.iterator();
            boolean proceedWithNextHandler = true;
            while (proceedWithNextHandler && itHandlers.hasNext()) {
                final AuthenticationHandler handler = itHandlers.next();
                if (handler.supports(credential)) {
                    try {
                        final PrincipalResolver resolver = getPrincipalResolverLinkedToHandlerIfAny(handler, transaction);
                        LOGGER.debug("Attempting authentication of [{}] using [{}]", credential.getId(), handler.getName());
                        authenticateAndResolvePrincipal(builder, credential, resolver, handler);
                        AuthenticationCredentialsThreadLocalBinder.bindInProgress(builder.build());
                        final Pair<Boolean, Set<Throwable>> failures = evaluateAuthenticationPolicies(builder.build(), transaction);
                        proceedWithNextHandler = !failures.getKey();
                    } catch (final Exception e) {
                        LOGGER.error("Authentication has failed. Credentials may be incorrect or CAS cannot " + "find authentication handler that supports [{}] of type [{}]. Examine the configuration to " + "ensure a method of authentication is defined and analyze CAS logs at DEBUG level to trace " + "the authentication event.", credential, credential.getClass().getSimpleName());
                        handleAuthenticationException(e, handler.getName(), builder);
                        proceedWithNextHandler = true;
                    }
                } else {
                    LOGGER.debug("Authentication handler [{}] does not support the credential type [{}]. Trying next...", handler.getName(), credential);
                }
            }
        }
        evaluateFinalAuthentication(builder, transaction);
        return builder;
    } finally {
        AuthenticationCredentialsThreadLocalBinder.clearInProgressAuthentication();
    }
}
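Here @NonNull appears on a local variable rather than a parameter; lombok then inserts a null check right after the assignment, so a null return from getAuthenticationHandlersForThisTransaction fails fast instead of surfacing later as an NPE inside the loop. A small sketch of that behavior (the names are illustrative, and the exact message depends on the lombok version):

import java.util.Set;
import lombok.NonNull;

public class LocalNonNullSketch {
    static Set<String> resolveHandlers() {
        return null; // simulate a misbehaving resolver
    }

    public static void main(final String[] args) {
        // With lombok, this declaration:
        @NonNull final Set<String> handlers = resolveHandlers();
        // is compiled roughly as:
        //   final Set<String> handlers = resolveHandlers();
        //   if (handlers == null) {
        //       throw new NullPointerException("handlers is marked non-null but is null");
        //   }
        System.out.println(handlers.size());
    }
}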

Example 5 with NonNull

Use of lombok.NonNull in project cas by Apereo.

From the class IdPInitiatedProfileHandlerController, method handleIdPInitiatedSsoRequest.

/**
 * Handles IdP-initiated SSO requests.
 *
 * @param response the response
 * @param request  the request
 * @throws Exception the exception
 */
@GetMapping(path = SamlIdPConstants.ENDPOINT_SAML2_IDP_INIT_PROFILE_SSO)
protected void handleIdPInitiatedSsoRequest(final HttpServletResponse response, final HttpServletRequest request) throws Exception {
    // The name (i.e., the entity ID) of the service provider.
    final String providerId = CommonUtils.safeGetParameter(request, SamlIdPConstants.PROVIDER_ID);
    if (StringUtils.isBlank(providerId)) {
        LOGGER.warn("No providerId parameter given in unsolicited SSO authentication request.");
        throw new MessageDecodingException("No providerId parameter given in unsolicited SSO authentication request.");
    }
    final SamlRegisteredService registeredService = verifySamlRegisteredService(providerId);
    final Optional<SamlRegisteredServiceServiceProviderMetadataFacade> adaptor = getSamlMetadataFacadeFor(registeredService, providerId);
    if (!adaptor.isPresent()) {
        throw new UnauthorizedServiceException(UnauthorizedServiceException.CODE_UNAUTHZ_SERVICE, "Cannot find metadata linked to " + providerId);
    }
    // The URL of the response location at the SP (called the "Assertion Consumer Service")
    // but can be omitted in favor of the IdP picking the default endpoint location from metadata.
    String shire = CommonUtils.safeGetParameter(request, SamlIdPConstants.SHIRE);
    final SamlRegisteredServiceServiceProviderMetadataFacade facade = adaptor.get();
    if (StringUtils.isBlank(shire)) {
        LOGGER.warn("Resolving service provider assertion consumer service URL for [{}] and binding [{}]", providerId, SAMLConstants.SAML2_POST_BINDING_URI);
        @NonNull final AssertionConsumerService acs = facade.getAssertionConsumerService(SAMLConstants.SAML2_POST_BINDING_URI);
        shire = acs.getLocation();
    }
    if (StringUtils.isBlank(shire)) {
        LOGGER.warn("Unable to resolve service provider assertion consumer service URL for AuthnRequest construction for entityID: [{}]", providerId);
        throw new MessageDecodingException("Unable to resolve SP ACS URL for AuthnRequest construction");
    }
    // The target resource at the SP, or a state token generated by an SP to represent the resource.
    final String target = CommonUtils.safeGetParameter(request, SamlIdPConstants.TARGET);
    // A timestamp to help with stale request detection.
    final String time = CommonUtils.safeGetParameter(request, SamlIdPConstants.TIME);
    final SAMLObjectBuilder builder = (SAMLObjectBuilder) configBean.getBuilderFactory().getBuilder(AuthnRequest.DEFAULT_ELEMENT_NAME);
    final AuthnRequest authnRequest = (AuthnRequest) builder.buildObject();
    authnRequest.setAssertionConsumerServiceURL(shire);
    final SAMLObjectBuilder isBuilder = (SAMLObjectBuilder) configBean.getBuilderFactory().getBuilder(Issuer.DEFAULT_ELEMENT_NAME);
    final Issuer issuer = (Issuer) isBuilder.buildObject();
    issuer.setValue(providerId);
    authnRequest.setIssuer(issuer);
    authnRequest.setProtocolBinding(SAMLConstants.SAML2_POST_BINDING_URI);
    final SAMLObjectBuilder pBuilder = (SAMLObjectBuilder) configBean.getBuilderFactory().getBuilder(NameIDPolicy.DEFAULT_ELEMENT_NAME);
    final NameIDPolicy nameIDPolicy = (NameIDPolicy) pBuilder.buildObject();
    nameIDPolicy.setAllowCreate(Boolean.TRUE);
    authnRequest.setNameIDPolicy(nameIDPolicy);
    if (NumberUtils.isCreatable(time)) {
        // The time parameter is conventionally epoch seconds; Joda-Time's DateTime expects epoch milliseconds.
        authnRequest.setIssueInstant(new DateTime(TimeUnit.MILLISECONDS.convert(Long.parseLong(time), TimeUnit.SECONDS), ISOChronology.getInstanceUTC()));
    } else {
        authnRequest.setIssueInstant(new DateTime(DateTime.now(), ISOChronology.getInstanceUTC()));
    }
    authnRequest.setForceAuthn(Boolean.FALSE);
    if (StringUtils.isNotBlank(target)) {
        request.setAttribute(SamlProtocolConstants.PARAMETER_SAML_RELAY_STATE, target);
    }
    final MessageContext ctx = new MessageContext();
    ctx.setAutoCreateSubcontexts(true);
    if (facade.isAuthnRequestsSigned()) {
        samlObjectSigner.encode(authnRequest, registeredService, facade, response, request, SAMLConstants.SAML2_POST_BINDING_URI);
    }
    ctx.setMessage(authnRequest);
    ctx.getSubcontext(SAMLBindingContext.class, true).setHasBindingSignature(false);
    final Pair<SignableSAMLObject, MessageContext> pair = Pair.of(authnRequest, ctx);
    initiateAuthenticationRequest(pair, response, request);
}
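One subtlety above is the issue-instant conversion: the time parameter arrives as epoch seconds while Joda-Time's DateTime constructor takes epoch milliseconds, and with TimeUnit.convert the receiver is the target unit. A quick check of the direction:

import java.util.concurrent.TimeUnit;

public class TimeUnitSketch {
    public static void main(final String[] args) {
        final long epochSeconds = 1_500_000_000L; // mid-2017 as epoch seconds
        // Seconds -> milliseconds: call convert on the *target* unit.
        System.out.println(TimeUnit.MILLISECONDS.convert(epochSeconds, TimeUnit.SECONDS)); // 1500000000000
        // Swapping the units divides instead, silently landing near 1970.
        System.out.println(TimeUnit.SECONDS.convert(epochSeconds, TimeUnit.MILLISECONDS)); // 1500000
    }
}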

Aggregations

NonNull (lombok.NonNull): 52
List (java.util.List): 31
Collectors (java.util.stream.Collectors): 22
Map (java.util.Map): 20
Slf4j (lombok.extern.slf4j.Slf4j): 18
ArrayList (java.util.ArrayList): 17
Collection (java.util.Collection): 17
HashMap (java.util.HashMap): 16
val (lombok.val): 14
Collections (java.util.Collections): 13
Duration (java.time.Duration): 11
Nullable (javax.annotation.Nullable): 11
TimeoutTimer (io.pravega.common.TimeoutTimer): 10
Set (java.util.Set): 10
CompletableFuture (java.util.concurrent.CompletableFuture): 10
Function (java.util.function.Function): 10
Futures (io.pravega.common.concurrent.Futures): 9
Getter (lombok.Getter): 9
Preconditions (com.google.common.base.Preconditions): 8
Strings (com.google.common.base.Strings): 8