
I'm trying to set up Spark integration tests on Maven with scalatest and spark-testing-base. The Spark job reads a CSV file, validates the results, and inserts the data into a database. I'm trying to test the validation by feeding in files of known formats and checking whether and how they fail; this particular test just makes sure validation passes on a clean file. Unfortunately, scalatest can't find my tests: the Scalatest Maven plugin reports "No tests were executed".

The relevant POM plugin:
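(The snippet itself is not shown here; a minimal scalatest-maven-plugin setup consistent with the log output and the update below would look roughly like this. The reportsDirectory/junitxml/filereports settings are the plugin's documented boilerplate, assumed rather than taken from the actual POM:)

<plugin>
    <groupId>org.scalatest</groupId>
    <artifactId>scalatest-maven-plugin</artifactId>
    <version>1.0</version>
    <configuration>
        <!-- boilerplate from the plugin's documentation; actual values assumed -->
        <reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
        <junitxml>.</junitxml>
        <filereports>WDF TestSuite.txt</filereports>
        <!-- package wildcard taken from the update below -->
        <wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites>
    </configuration>
    <executions>
        <execution>
            <id>test</id>
            <goals>
                <goal>test</goal>
            </goals>
        </execution>
    </executions>
</plugin>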

And here's the test class:

import scala.util.{Failure, Success, Try}

import com.holdenkarau.spark.testing.SharedSparkContext
import org.apache.spark.sql.{DataFrame, DataFrameReader, SQLContext}
import org.scalatest.{BeforeAndAfter, FlatSpec, Matchers}

// SchemaStrategy / SchemaStrategyChooser come from the project under test
class ProficiencySchemaITest extends FlatSpec with Matchers with SharedSparkContext with BeforeAndAfter {
    private var schemaStrategy: SchemaStrategy = _
    private var dataReader: DataFrameReader = _

    before {
        val sqlContext = new SQLContext(sc)
        import sqlContext._
        import sqlContext.implicits._

        // Reader for headered CSVs, treating empty strings as nulls
        val dataInReader = sqlContext.read.format("com.databricks.spark.csv")
            .option("header", "true")
            .option("nullValue", "")
        schemaStrategy = SchemaStrategyChooser("dim_state_test_proficiency")
        dataReader = schemaStrategy.applySchema(dataInReader)
    }

    "Proficiency Validation" should "pass with the CSV file proficiency-valid.csv" in {
        val dataIn = dataReader.load("src/test/resources/proficiency-valid.csv")

        val valid: Try[DataFrame] = Try(schemaStrategy.validateCsv(dataIn))
        valid match {
            case Success(v) => ()
            case Failure(e) => fail("Validation failed on what should have been a clean file: ", e)
        }
    }
}

When I run mvn test, it can't find any tests and outputs this:

[INFO] --- scalatest-maven-plugin:1.0:test (test) @ load-csv-into-db --- 
Discovery starting.
Discovery completed in 54 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 133 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.

UPDATE
Using:

<suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest</suites> 

instead of:

<wildcardSuites>com.cainc.data.etl.schema.proficiency</wildcardSuites> 

I can get one test to run. Obviously, that's not ideal. It's possible wildcardSuites is broken somehow; I'm going to open a ticket on GitHub and see what happens.
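For reference, the workaround replaces the package wildcard with the fully qualified suite name in the plugin's <configuration> block (a sketch; the suite name is the one from the lines above):

<configuration>
    <!-- run this one suite explicitly instead of discovering by package wildcard -->
    <suites>com.cainc.data.etl.schema.proficiency.ProficiencySchemaITest</suites>
</configuration>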


Where is 'ProficiencySchemaITest' located? – hasumedic


src/test/scala/com/cainc/data/etl/schema/proficiency/ProficiencySchemaITest.scala. In my directory I have both src/main/scala and src/test/scala. – Azuaron


The .class files are also being created in target/. – Azuaron

Answers


This is probably because there are space characters in the project path. Remove the spaces from the project path and the tests will be discovered successfully. Hope this helps.


Full path: C:\dev\projects\nintendo\spark-data-load\src\main\blah\blah\blah – Azuaron


The spaces-in-project-path issue should be fixed by this PR: https://github.com/scalatest/scalatest-maven-plugin/pull/37 – cstroe


Try excluding junit as a transitive dependency. Worked for me. Example below, but note that the Scala and Spark versions are specific to my environment.

<dependency>
    <groupId>com.holdenkarau</groupId>
    <artifactId>spark-testing-base_2.10</artifactId>
    <version>1.5.0_0.6.0</version>
    <scope>test</scope>
    <exclusions>
        <!-- junit is not compatible with scalatest -->
        <exclusion>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
        </exclusion>
    </exclusions>
</dependency>
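To verify the exclusion actually took effect, you can filter the dependency tree for junit (a generic Maven check, not part of the original answer):

mvn dependency:tree -Dincludes=junit:junit

If the exclusion works, no junit artifact should appear under spark-testing-base in the output.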