
Example 1 with FSTableDescriptors

Use of org.apache.hadoop.hbase.util.FSTableDescriptors in the Apache HBase project.

From the class HBaseTestCase, the createMetaRegion method:

/**
   * You must call {@link #closeRootAndMeta()} when done after calling this
   * method; it performs the cleanup.
   * @throws IOException if the meta region cannot be created
   */
protected void createMetaRegion() throws IOException {
    FSTableDescriptors fsTableDescriptors = new FSTableDescriptors(conf);
    meta = HBaseTestingUtility.createRegionAndWAL(HRegionInfo.FIRST_META_REGIONINFO,
        testDir, conf, fsTableDescriptors.get(TableName.META_TABLE_NAME));
}
Also used: FSTableDescriptors (org.apache.hadoop.hbase.util.FSTableDescriptors)

Example 2 with FSTableDescriptors


From the class SnapshotManifest, the consolidate method:

public void consolidate() throws IOException {
    if (getSnapshotFormat(desc) == SnapshotManifestV1.DESCRIPTOR_VERSION) {
        Path rootDir = FSUtils.getRootDir(conf);
        LOG.info("Using old Snapshot Format");
        // write a copy of descriptor to the snapshot directory
        new FSTableDescriptors(conf, fs, rootDir)
            .createTableDescriptorForTableDirectory(workingDir, htd, false);
    } else {
        LOG.debug("Convert to Single Snapshot Manifest");
        convertToV2SingleManifest();
    }
}
Also used: Path (org.apache.hadoop.fs.Path), FSTableDescriptors (org.apache.hadoop.hbase.util.FSTableDescriptors)

Example 3 with FSTableDescriptors


From the class TestHRegionInfo, the testReadAndWriteHRegionInfoFile method:

@Test
public void testReadAndWriteHRegionInfoFile() throws IOException, InterruptedException {
    HBaseTestingUtility htu = new HBaseTestingUtility();
    HRegionInfo hri = HRegionInfo.FIRST_META_REGIONINFO;
    Path basedir = htu.getDataTestDir();
    // Create a region.  That'll write the .regioninfo file.
    FSTableDescriptors fsTableDescriptors = new FSTableDescriptors(htu.getConfiguration());
    HRegion r = HBaseTestingUtility.createRegionAndWAL(hri, basedir, htu.getConfiguration(),
        fsTableDescriptors.get(TableName.META_TABLE_NAME));
    // Get modtime on the file.
    long modtime = getModTime(r);
    HBaseTestingUtility.closeRegionAndWAL(r);
    // Sleep past filesystem modtime granularity so a rewrite would be detectable.
    Thread.sleep(1001);
    r = HRegion.openHRegion(basedir, hri, fsTableDescriptors.get(TableName.META_TABLE_NAME),
        null, htu.getConfiguration());
    // Ensure the file is not written for a second time.
    long modtime2 = getModTime(r);
    assertEquals(modtime, modtime2);
    // Now load the file.
    HRegionInfo deserializedHri = HRegionFileSystem.loadRegionInfoFileContent(
        r.getRegionFileSystem().getFileSystem(), r.getRegionFileSystem().getRegionDir());
    assertEquals(hri, deserializedHri);
    HBaseTestingUtility.closeRegionAndWAL(r);
}
Also used: HRegionInfo (org.apache.hadoop.hbase.HRegionInfo), Path (org.apache.hadoop.fs.Path), HBaseTestingUtility (org.apache.hadoop.hbase.HBaseTestingUtility), FSTableDescriptors (org.apache.hadoop.hbase.util.FSTableDescriptors), Test (org.junit.Test)

Example 4 with FSTableDescriptors


From the class TestFSTableDescriptorForceCreation, the testShouldCreateNewTableDescriptorIfForcefulCreationIsFalse method:

@Test
public void testShouldCreateNewTableDescriptorIfForcefulCreationIsFalse() throws IOException {
    final String name = this.name.getMethodName();
    FileSystem fs = FileSystem.get(UTIL.getConfiguration());
    Path rootdir = new Path(UTIL.getDataTestDir(), name);
    FSTableDescriptors fstd = new FSTableDescriptors(fs, rootdir);
    assertTrue("Should create new table descriptor",
        fstd.createTableDescriptor(TableDescriptorBuilder.newBuilder(TableName.valueOf(name)).build(), false));
}
Also used: Path (org.apache.hadoop.fs.Path), FileSystem (org.apache.hadoop.fs.FileSystem), FSTableDescriptors (org.apache.hadoop.hbase.util.FSTableDescriptors), Test (org.junit.Test)

Example 5 with FSTableDescriptors


From the class TestFSTableDescriptorForceCreation, the testShouldAllowForcefulCreationOfAlreadyExistingTableDescriptor method:

@Test
public void testShouldAllowForcefulCreationOfAlreadyExistingTableDescriptor() throws Exception {
    final String name = this.name.getMethodName();
    FileSystem fs = FileSystem.get(UTIL.getConfiguration());
    Path rootdir = new Path(UTIL.getDataTestDir(), name);
    FSTableDescriptors fstd = new FSTableDescriptors(fs, rootdir);
    TableDescriptor htd = TableDescriptorBuilder.newBuilder(TableName.valueOf(name)).build();
    fstd.createTableDescriptor(htd, false);
    assertTrue("Should overwrite existing table descriptor when forced",
        fstd.createTableDescriptor(htd, true));
}
Also used: Path (org.apache.hadoop.fs.Path), FileSystem (org.apache.hadoop.fs.FileSystem), FSTableDescriptors (org.apache.hadoop.hbase.util.FSTableDescriptors), TableDescriptor (org.apache.hadoop.hbase.client.TableDescriptor), Test (org.junit.Test)
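
Examples 4 and 5 both exercise the same contract: createTableDescriptor(htd, force) refuses to overwrite an existing descriptor unless force is true. The following is a minimal, JDK-only sketch of that pattern so it can be run without an HBase dependency. FileDescriptorStore and the ".tableinfo" file naming are hypothetical stand-ins, not the real HBase API.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

/** Hypothetical stand-in for the create-unless-exists contract of FSTableDescriptors. */
class FileDescriptorStore {
    private final Path rootdir;

    FileDescriptorStore(Path rootdir) {
        this.rootdir = rootdir;
    }

    /**
     * Persists the descriptor for the given table under the root directory.
     * Returns false if a descriptor already exists and {@code force} is false,
     * mirroring createTableDescriptor(htd, force) in the examples above.
     */
    boolean createTableDescriptor(String tableName, String descriptor, boolean force)
            throws IOException {
        Path file = rootdir.resolve(tableName + ".tableinfo");
        if (Files.exists(file) && !force) {
            return false; // refuse to clobber an existing descriptor unless forced
        }
        Files.createDirectories(rootdir);
        Files.writeString(file, descriptor); // truncates and rewrites when forced
        return true;
    }

    /** Reads back the stored descriptor for the given table. */
    String get(String tableName) throws IOException {
        return Files.readString(rootdir.resolve(tableName + ".tableinfo"));
    }
}
```

The first unforced write succeeds, a second unforced write is rejected, and a forced write replaces the stored descriptor, which is exactly the sequence the two tests above assert.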

Aggregations

FSTableDescriptors (org.apache.hadoop.hbase.util.FSTableDescriptors): 21
Path (org.apache.hadoop.fs.Path): 16
Test (org.junit.Test): 10
TableDescriptor (org.apache.hadoop.hbase.client.TableDescriptor): 6
FileSystem (org.apache.hadoop.fs.FileSystem): 5
RegionInfo (org.apache.hadoop.hbase.client.RegionInfo): 5
MasterFileSystem (org.apache.hadoop.hbase.master.MasterFileSystem): 4
Configuration (org.apache.hadoop.conf.Configuration): 3
HRegionInfo (org.apache.hadoop.hbase.HRegionInfo): 3
HTableDescriptor (org.apache.hadoop.hbase.HTableDescriptor): 3
HBaseConfiguration (org.apache.hadoop.hbase.HBaseConfiguration): 2
HBaseTestingUtil (org.apache.hadoop.hbase.HBaseTestingUtil): 2
HBaseTestingUtility (org.apache.hadoop.hbase.HBaseTestingUtility): 2
TableDescriptors (org.apache.hadoop.hbase.TableDescriptors): 2
WALFactory (org.apache.hadoop.hbase.wal.WALFactory): 2
IOException (java.io.IOException): 1
ArrayList (java.util.ArrayList): 1
Cell (org.apache.hadoop.hbase.Cell): 1
HColumnDescriptor (org.apache.hadoop.hbase.HColumnDescriptor): 1
MiniHBaseCluster (org.apache.hadoop.hbase.MiniHBaseCluster): 1