geotrellis.spark.store.hadoop.cog

HadoopCOGSparkLayerProvider

class HadoopCOGSparkLayerProvider extends HadoopCOGCollectionLayerProvider with COGLayerReaderProvider with COGLayerWriterProvider

Provides a HadoopAttributeStore instance for URIs with the hdfs, hdfs+file, s3n, s3a, wasb, and wasbs schemes. The URI represents the Hadoop Path of the catalog root. wasb and wasbs provide support for the Hadoop Azure connector; additional configuration is required to use them. This provider intentionally does not handle the s3 scheme because the Hadoop implementation is poor. That support is provided by S3AttributeStore.
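The scheme matching described above can be sketched in plain Scala. This is a hypothetical reimplementation for illustration only (`hadoopSchemes` and `canProcessSketch` are not GeoTrellis names); it shows the kind of check `canProcess(uri)` presumably performs, with the s3 scheme deliberately absent:

```scala
import java.net.URI

// Illustrative set of schemes this provider accepts, per the description
// above. Note "s3" is intentionally excluded; "s3n"/"s3a" are accepted.
val hadoopSchemes = Set("hdfs", "hdfs+file", "s3n", "s3a", "wasb", "wasbs")

// A sketch of a canProcess-style check: compare the URI's scheme,
// case-insensitively, against the supported set.
def canProcessSketch(uri: URI): Boolean =
  Option(uri.getScheme).exists(s => hadoopSchemes.contains(s.toLowerCase))

canProcessSketch(new URI("hdfs://namenode/catalog")) // accepted scheme
canProcessSketch(new URI("s3://bucket/catalog"))     // rejected: plain s3
```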

Inherited
  1. HadoopCOGSparkLayerProvider
  2. COGLayerWriterProvider
  3. COGLayerReaderProvider
  4. HadoopCOGCollectionLayerProvider
  5. COGCollectionLayerReaderProvider
  6. COGValueReaderProvider
  7. AttributeStoreProvider
  8. AnyRef
  9. Any

Instance Constructors

  1. new HadoopCOGSparkLayerProvider()

Value Members

  1. def attributeStore(uri: URI): AttributeStore
  2. def canProcess(uri: URI): Boolean
  3. def collectionLayerReader(uri: URI, store: AttributeStore): HadoopCOGCollectionLayerReader
  4. def layerReader(uri: URI, store: AttributeStore, sc: SparkContext): COGLayerReader[LayerId]
  5. def layerWriter(uri: URI, store: AttributeStore): COGLayerWriter
  6. def valueReader(uri: URI, store: AttributeStore): COGValueReader[LayerId]
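The members above follow a common pattern: a provider advertises which URIs it can handle via canProcess, and a lookup walks the registered providers until one matches before constructing the concrete store, reader, or writer. The sketch below is a simplified, self-contained model of that resolution (GeoTrellis registers providers through Java's ServiceLoader; all names here are illustrative, not the library's own):

```scala
import java.net.URI

// Simplified provider contract: each provider says which URIs it handles.
trait SketchProvider {
  def canProcess(uri: URI): Boolean
  def name: String
}

// Stand-in for this Hadoop provider: hdfs/wasb-family schemes, no "s3".
object HadoopSketchProvider extends SketchProvider {
  private val schemes = Set("hdfs", "hdfs+file", "s3n", "s3a", "wasb", "wasbs")
  def canProcess(uri: URI): Boolean =
    Option(uri.getScheme).exists(s => schemes.contains(s.toLowerCase))
  def name = "hadoop"
}

// Stand-in for a separate S3 provider covering the plain "s3" scheme.
object S3SketchProvider extends SketchProvider {
  def canProcess(uri: URI): Boolean =
    Option(uri.getScheme).exists(_.equalsIgnoreCase("s3"))
  def name = "s3"
}

val registered: List[SketchProvider] = List(HadoopSketchProvider, S3SketchProvider)

// Resolve the first provider that accepts the catalog URI, as factory
// methods like attributeStore(uri) presumably do before building a store.
def resolve(uri: URI): Option[SketchProvider] =
  registered.find(_.canProcess(uri))
```

This first-match dispatch is why the Hadoop provider declining the s3 scheme matters: declining lets resolution fall through to the dedicated S3 provider instead of shadowing it.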