
geotrellis.spark.store.hadoop

HadoopSparkLayerProvider

class HadoopSparkLayerProvider extends HadoopCollectionLayerProvider with LayerReaderProvider with LayerWriterProvider

Provides a HadoopAttributeStore instance for URIs with the hdfs, hdfs+file, s3n, s3a, wasb, and wasbs schemes. The URI represents the Hadoop Path of the catalog root. The wasb and wasbs schemes provide support for the Hadoop Azure connector; additional configuration is required for this. This provider intentionally does not handle the s3 scheme because the Hadoop implementation is poor; that support is provided by S3AttributeStore.
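
As a minimal sketch (not part of this page), the provider is usually reached through the SPI-based factory rather than instantiated directly; this assumes the geotrellis.store.AttributeStore companion factory and a hypothetical catalog path:

  import java.net.URI
  import geotrellis.store.AttributeStore

  // Hypothetical catalog root; any hdfs, hdfs+file, s3n, s3a, wasb, or
  // wasbs URI is matched by this provider.
  val catalogUri = new URI("hdfs://namenode:8020/geotrellis/catalog")

  // The factory scans registered AttributeStoreProviders and, for the hdfs
  // scheme, returns a HadoopAttributeStore rooted at the given Path.
  val store: AttributeStore = AttributeStore(catalogUri)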

Inherited
  1. HadoopSparkLayerProvider
  2. LayerWriterProvider
  3. LayerReaderProvider
  4. HadoopCollectionLayerProvider
  5. CollectionLayerReaderProvider
  6. ValueReaderProvider
  7. AttributeStoreProvider
  8. AnyRef
  9. Any

Instance Constructors

  1. new HadoopSparkLayerProvider()
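
The constructor takes no arguments. A short sketch of direct instantiation, with illustrative URIs, showing the scheme dispatch described above:

  import java.net.URI
  import geotrellis.spark.store.hadoop.HadoopSparkLayerProvider

  val provider = new HadoopSparkLayerProvider()

  provider.canProcess(new URI("hdfs://namenode:8020/catalog"))  // true
  provider.canProcess(new URI("wasb://container@account.blob.core.windows.net/catalog")) // true
  provider.canProcess(new URI("s3://bucket/catalog"))           // false: left to the S3 providers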

Value Members

  1. def attributeStore(uri: URI): AttributeStore
  2. def canProcess(uri: URI): Boolean
  3. def collectionLayerReader(uri: URI, store: AttributeStore): HadoopCollectionLayerReader
  4. def layerReader(uri: URI, store: AttributeStore, sc: SparkContext): FilteringLayerReader[LayerId]
  5. def layerWriter(uri: URI, store: AttributeStore): LayerWriter[LayerId]
  6. def valueReader(uri: URI, store: AttributeStore): ValueReader[LayerId]
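
A hedged end-to-end sketch of the value members above; the catalog URI is hypothetical and a live SparkContext is assumed to be supplied by the caller:

  import java.net.URI
  import org.apache.spark.SparkContext
  import geotrellis.store.AttributeStore
  import geotrellis.spark.store.hadoop.HadoopSparkLayerProvider

  def wireUp(sc: SparkContext): Unit = {
    val uri      = new URI("hdfs://namenode:8020/geotrellis/catalog")
    val provider = new HadoopSparkLayerProvider()

    val store: AttributeStore = provider.attributeStore(uri)

    // RDD-based reader (requires the SparkContext) and writer for the catalog.
    val reader = provider.layerReader(uri, store, sc) // FilteringLayerReader[LayerId]
    val writer = provider.layerWriter(uri, store)     // LayerWriter[LayerId]

    // Non-Spark counterparts: whole-layer collection reads and key lookups.
    val collectionReader = provider.collectionLayerReader(uri, store)
    val valueReader      = provider.valueReader(uri, store)
  }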