
Example 26 with UDFContext

use of org.apache.pig.impl.util.UDFContext in project mongo-hadoop by mongodb.

the class BSONStorage method checkSchema.

@Override
public void checkSchema(final ResourceSchema schema) throws IOException {
    this.schema = schema;
    // Store the schema as a string in UDFContext so that the back-end tasks,
    // which run in a different JVM, can reconstruct it in prepareToWrite().
    UDFContext context = UDFContext.getUDFContext();
    Properties p = context.getUDFProperties(getClass(), new String[] { udfcSignature });
    p.setProperty(SCHEMA_SIGNATURE, schema.toString());
}
Also used : UDFContext(org.apache.pig.impl.util.UDFContext) Properties(java.util.Properties)
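
The udfcSignature key used above is not chosen by the UDF itself: Pig assigns each StoreFunc instance a unique signature so that the front-end and back-end copies of the UDF share one property set inside UDFContext. A minimal sketch of that wiring, assuming the same udfcSignature field as above (this is the standard StoreFunc contract, not necessarily the exact mongo-hadoop source):

@Override
public void setStoreFuncUDFContextSignature(final String signature) {
    // Pig calls this on both the front end and the back end with the same
    // value, so the signature is a stable key for getUDFProperties().
    udfcSignature = signature;
}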

Example 27 with UDFContext

use of org.apache.pig.impl.util.UDFContext in project mongo-hadoop by mongodb.

the class MongoInsertStorage method prepareToWrite.

@Override
public void prepareToWrite(final RecordWriter writer) throws IOException {
    out = writer;
    if (out == null) {
        throw new IOException("Invalid Record Writer");
    }
    // Retrieve the schema string that checkSchema() stored on the front end.
    UDFContext udfc = UDFContext.getUDFContext();
    Properties p = udfc.getUDFProperties(getClass(), new String[] { udfcSignature });
    String strSchema = p.getProperty(SCHEMA_SIGNATURE);
    if (strSchema == null) {
        LOG.warn("Could not find schema in UDF context. Interpreting each tuple as containing a single map.");
    } else {
        try {
            // Parse the schema from the string stored in the properties object.
            schema = new ResourceSchema(Utils.getSchemaFromString(strSchema));
        } catch (Exception e) {
            // Fall back to schema-less mode rather than failing the task.
            schema = null;
            LOG.warn(e.getMessage());
        }
        if (LOG.isDebugEnabled()) {
            LOG.debug("GOT A SCHEMA " + schema + " " + strSchema);
        }
    }
}
Also used : ResourceSchema(org.apache.pig.ResourceSchema) UDFContext(org.apache.pig.impl.util.UDFContext) IOException(java.io.IOException) Properties(java.util.Properties)
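
Examples 26 and 27 are two halves of one pattern: serialize the schema on the front end, parse it back on the back end. A self-contained sketch of that round-trip (the class name, the "schema" property key, and "demo-signature" are hypothetical; the Pig calls are the ones used in the examples above):

import java.util.Properties;

import org.apache.pig.ResourceSchema;
import org.apache.pig.impl.util.UDFContext;
import org.apache.pig.impl.util.Utils;

public class SchemaRoundTrip {
    public static void main(final String[] args) throws Exception {
        Properties p = UDFContext.getUDFContext()
                .getUDFProperties(SchemaRoundTrip.class, new String[] { "demo-signature" });
        // Front end: store the schema as a string, as checkSchema() does above.
        ResourceSchema original =
                new ResourceSchema(Utils.getSchemaFromString("name:chararray, age:int"));
        p.setProperty("schema", original.toString());
        // Back end: parse it back, as prepareToWrite() does above.
        ResourceSchema restored =
                new ResourceSchema(Utils.getSchemaFromString(p.getProperty("schema")));
        System.out.println(restored);
    }
}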

Example 28 with UDFContext

use of org.apache.pig.impl.util.UDFContext in project phoenix by apache.

the class ReserveNSequence method initConnection.

private void initConnection() throws IOException {
    // Build the configuration used for Phoenix connections from the job's
    // configuration, pointing it at the right ZooKeeper quorum and tenant.
    UDFContext context = UDFContext.getUDFContext();
    configuration = new Configuration(context.getJobConf());
    configuration.set(HConstants.ZOOKEEPER_QUORUM, this.zkQuorum);
    if (Strings.isNullOrEmpty(tenantId)) {
        // No tenant: clear any tenant ID inherited from the job configuration.
        configuration.unset(PhoenixRuntime.TENANT_ID_ATTRIB);
    } else {
        configuration.set(PhoenixRuntime.TENANT_ID_ATTRIB, tenantId);
    }
    try {
        connection = ConnectionUtil.getOutputConnection(configuration);
    } catch (SQLException e) {
        throw new IOException("Caught exception while creating connection", e);
    }
}
Also used : Configuration(org.apache.hadoop.conf.Configuration) SQLException(java.sql.SQLException) UDFContext(org.apache.pig.impl.util.UDFContext) IOException(java.io.IOException)
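
A connection opened this way also needs to be released; a Pig EvalFunc typically does that in finish(), which Pig calls after the last tuple. A minimal sketch of the matching teardown, assuming the same connection field as above (hedged; the actual Phoenix source may structure its cleanup differently):

private void closeConnection() throws IOException {
    // Release the Phoenix JDBC connection opened in initConnection().
    if (connection != null) {
        try {
            connection.close();
        } catch (SQLException e) {
            throw new IOException("Caught exception while closing connection", e);
        }
    }
}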

Aggregations

UDFContext (org.apache.pig.impl.util.UDFContext) 28
Properties (java.util.Properties) 23
IOException (java.io.IOException) 14
ResourceSchema (org.apache.pig.ResourceSchema) 4
Configuration (org.apache.hadoop.conf.Configuration) 3
DataBag (org.apache.pig.data.DataBag) 3
Tuple (org.apache.pig.data.Tuple) 3
InterruptedException (java.lang.InterruptedException) 2
Path (org.apache.hadoop.fs.Path) 2
HCatException (org.apache.hive.hcatalog.common.HCatException) 2
HCatSchema (org.apache.hive.hcatalog.data.schema.HCatSchema) 2
TException (org.apache.thrift.TException) 2
JsonParseException (org.codehaus.jackson.JsonParseException) 2
JsonMappingException (org.codehaus.jackson.map.JsonMappingException) 2
MongoUpdateWritable (com.mongodb.hadoop.io.MongoUpdateWritable) 1
URI (java.net.URI) 1
URISyntaxException (java.net.URISyntaxException) 1
SQLException (java.sql.SQLException) 1
ParseException (java.text.ParseException) 1
ArrayList (java.util.ArrayList) 1