GeoMesaInputFormat

Package org.locationtech.geomesa.jobs.mapreduce
Related: class GeoMesaInputFormat

object GeoMesaInputFormat extends LazyLogging

Annotations
@experimental()
Linear Supertypes
LazyLogging, AnyRef, Any

Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. val SYS_PROP_SPARK_LOAD_CP: String

  5. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  6. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  7. def configure(job: Job, dsParams: Map[String, String], query: Query): Unit

    Configure the input format.

    This is a single method because several values have to be calculated and passed to the underlying AccumuloInputFormat, and there is no good hook to indicate when the configuration is finished. See the usage sketch after the member list.

    Annotations
    @experimental()
  8. def configure(job: Job, dsParams: Map[String, String], featureTypeName: String, filter: Option[String] = None, transform: Option[Array[String]] = None): Unit

    Annotations
    @experimental()
  9. def ensureSparkClasspath(): Unit

    Takes any jars that have been loaded by Spark in the context classloader and makes them available to the general classloader. This is required because not all classes (even Spark ones) check the context classloader. See the sketch after the member list.

    Annotations
    @experimental()
  10. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  11. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  12. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  14. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  15. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  16. lazy val logger: Logger

    Attributes
    protected
    Definition Classes
    LazyLogging
  17. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  18. final def notify(): Unit

    Definition Classes
    AnyRef
  19. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  20. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  21. def toString(): String

    Definition Classes
    AnyRef → Any
  22. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
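Example: configuring a Hadoop job. A minimal sketch of the two configure overloads above (members 7 and 8). The data store parameter keys, the feature type name "gdelt", the ECQL filter, and the transform attributes ("geom", "dtg") are illustrative assumptions rather than values defined by this object; only the configure signatures come from the documentation above.

  import org.apache.hadoop.conf.Configuration
  import org.apache.hadoop.mapreduce.Job
  import org.geotools.data.Query
  import org.geotools.filter.text.ecql.ECQL
  import org.locationtech.geomesa.jobs.mapreduce.GeoMesaInputFormat

  object ConfigureExample {
    def main(args: Array[String]): Unit = {
      val job = Job.getInstance(new Configuration(), "geomesa-input-example")

      // Hypothetical Accumulo data store parameters; the exact keys depend on
      // your GeoMesa version and deployment.
      val dsParams = Map(
        "instanceId" -> "myInstance",
        "zookeepers" -> "zoo1:2181,zoo2:2181,zoo3:2181",
        "user"       -> "root",
        "password"   -> "secret",
        "tableName"  -> "geomesa.catalog"
      )

      // Overload 7: configure from a GeoTools Query (type name plus filter).
      val query = new Query("gdelt", ECQL.toFilter("BBOX(geom, -80, 35, -70, 45)"))
      GeoMesaInputFormat.configure(job, dsParams, query)

      // Overload 8: configure from a feature type name with an optional filter
      // string and a projection onto selected attributes. Shown back-to-back
      // with overload 7 only to illustrate both signatures.
      GeoMesaInputFormat.configure(
        job,
        dsParams,
        "gdelt",
        filter    = Some("BBOX(geom, -80, 35, -70, 45)"),
        transform = Some(Array("geom", "dtg"))
      )

      // Read through the GeoMesaInputFormat class from this package.
      job.setInputFormatClass(classOf[GeoMesaInputFormat])
    }
  }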

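Example: making Spark-loaded jars visible. A minimal sketch of ensureSparkClasspath (member 9). Whether and how the system property named by SYS_PROP_SPARK_LOAD_CP gates this behavior is an assumption here, so treat the property line as illustrative.

  import org.locationtech.geomesa.jobs.mapreduce.GeoMesaInputFormat

  object SparkClasspathExample {
    def main(args: Array[String]): Unit = {
      // Assumption: the system property named by SYS_PROP_SPARK_LOAD_CP is read
      // as an opt-in flag; check the GeoMesa source for its actual key and
      // expected value before relying on this.
      System.setProperty(GeoMesaInputFormat.SYS_PROP_SPARK_LOAD_CP, "true")

      // Copy jars that Spark put on the thread context classloader over to the
      // general classloader, for code that never consults the context loader.
      GeoMesaInputFormat.ensureSparkClasspath()
    }
  }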