
spark-sql: exception when querying a temp table ("No TypeTag" error)?

Purpose of the code: a simple test of spark-sql reading a file from HDFS, then using SQL to conveniently pull out temporary data.

Guess about the error: at first I suspected the dependency versions in the pom file, but now I'm not sure what the problem is. Details below:

1: Running in spark-shell on the cluster works fine; I can happily query the people results.

val sqlContext = new org.apache.spark.sql.SQLContext(sc)
case class Person(name: String)
// One Person per line; the first space-separated field is the name
val people = sc.textFile("/workspace/xx/tmp/a").map(_.split(" ")).map(p => Person(p(0)))
val peopleSchema = sqlContext.createSchemaRDD(people)
peopleSchema.registerTempTable("people")
val df = sqlContext.sql("select * from people")
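For illustration, assuming the input file /workspace/xx/tmp/a holds hypothetical lines such as "zhangsan 20" and "lisi 25", the temp table can then be read back in the same shell session:

df.map(t => "name: " + t(0)).collect().foreach(println)
// prints: name: zhangsan
//         name: lisi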
2: When packaging locally, it fails to compile.
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object sparkSql {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("sql_data")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
    case class Person(name: String)  // defined inside main
    val people = sc.textFile("/workspace/xx/tmp/a").map(_.split(" ")).map(p => Person(p(0)))
    val peopleSchema = sqlContext.createSchemaRDD(people)
    peopleSchema.registerTempTable("people")
    val df = sqlContext.sql("select * from people")
    df.map(t => t(0)).collect().foreach(println)
  }
}
Error messages:
Information: 2015/4/17 20:29 - Compilation completed with 2 errors and 0 warnings in 1 sec
E:\sparkSql.scala
Error:(26, 50) No TypeTag available for Person
    val peopleSchema = sqlContext.createSchemaRDD(people)
                                                 ^
Error:(26, 50) not enough arguments for method createSchemaRDD: (implicit evidence$1: reflect.runtime.universe.TypeTag[Person])org.apache.spark.sql.SchemaRDD.
Unspecified value parameter evidence$1.
    val peopleSchema = sqlContext.createSchemaRDD(people)
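The root cause is independent of Spark: Scala reflection cannot produce a TypeTag for a class defined inside a method, so the compiler cannot fill in the implicit evidence$1 parameter. A minimal, Spark-free sketch (hypothetical names) that reproduces the same compiler error:

import scala.reflect.runtime.universe._

object TypeTagDemo {
  // Like createSchemaRDD, this method demands an implicit TypeTag for A
  def describe[A: TypeTag](value: A): String = typeOf[A].toString

  case class Outer(name: String) // top-level member: TypeTag is available

  def main(args: Array[String]) {
    println(describe(Outer("ok"))) // compiles; prints "TypeTagDemo.Outer"

    case class Inner(name: String) // method-local class: no TypeTag
    // describe(Inner("boom"))     // uncommenting this fails with
    //                             // "No TypeTag available for Inner"
  }
}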

3: pom

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.10</artifactId>
    <version>1.1.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.10</artifactId>
    <version>1.1.0</version>
</dependency>



爱吃鱼的程序员 2020-06-14 16:38:00
1 answer

    I made the change in my local code, but it still reports the same error!

    import sqlContext.createSchemaRDD
    case class Person(name: String)



    Moving the case class outside the method fixed it!

    Just move your case class out of the method definition.

    http://stackoverflow.com/questions/29143756/scala-spark-app-with-no-typetag-available-error-in-def-main-style-app
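
    For reference, a minimal sketch of the corrected program, assuming the same Spark 1.1.0 / Scala 2.10 setup from the question (createSchemaRDD was removed in later Spark versions); paths and names are copied from the question:

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext

    // Top-level case class: Scala reflection can now supply TypeTag[Person]
    case class Person(name: String)

    object sparkSql {
      def main(args: Array[String]) {
        val conf = new SparkConf().setAppName("sql_data")
        val sc = new SparkContext(conf)
        val sqlContext = new SQLContext(sc)
        val people = sc.textFile("/workspace/xx/tmp/a").map(_.split(" ")).map(p => Person(p(0)))
        val peopleSchema = sqlContext.createSchemaRDD(people)
        peopleSchema.registerTempTable("people")
        val df = sqlContext.sql("select * from people")
        df.map(t => t(0)).collect().foreach(println)
      }
    }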

    thanks 
    2020-06-14 16:38:16