I assume some other dependency is needed, but I can't figure out which one. Any help is appreciated.
My pom file is:
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.spark</groupId>
    <artifactId>spark</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
    <name>M101J</name>
    <url>http://maven.apache.org</url>
    <dependencies>
        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>1.1.1</version>
        </dependency>
        <dependency>
            <groupId>org.freemarker</groupId>
            <artifactId>freemarker</artifactId>
            <version>2.3.19</version>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-simple</artifactId>
            <version>1.6.4</version>
        </dependency>
    </dependencies>
    <repositories>
        <repository>
            <id>Spark repository</id>
            <url>http://sparkjava.com/nexus/content/repositories/spark/</url>
        </repository>
    </repositories>
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <configuration>
                    <archive>
                        <manifest>
                            <mainClass>com.spark.SparkHomework</mainClass>
                        </manifest>
                    </archive>
                    <descriptorRefs>
                        <descriptorRef>jar-with-dependencies</descriptorRef>
                    </descriptorRefs>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
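As a side note, the assembly plugin configured above already writes `mainClass` into the jar manifest, so the fat jar can be built and run directly, sidestepping `exec:java` (and its argument-quoting issues) entirely. A sketch, assuming the jar name Maven derives from the `artifactId`/`version` above:

```shell
# Build the jar-with-dependencies described by the assembly plugin
# (no <version> is pinned for the plugin above, so Maven resolves one itself)
mvn clean package assembly:single

# The manifest's Main-Class is already set, so no -Dexec.mainClass is needed
java -jar target/spark-0.0.1-SNAPSHOT-jar-with-dependencies.jar
```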
When I try to run the project with the command:
mvn compile exec:java -Dexec.mainClass=com.spark.SparkHomework
I get this error:
Unknown lifecycle phase ".mainClass=com.spark.SparkHomework". You must specify a valid lifecycle phase or a goal in the format &lt;plugin-prefix&gt;:&lt;goal&gt; or &lt;plugin-group-id&gt;:&lt;plugin-artifact-id&gt;[:&lt;plugin-version&gt;]:&lt;goal&gt;. Available lifecycle phases are: validate, initialize, generate-sources, process-sources, generate-resources, process-resources, compile, process-classes, generate-test-sources, process-test-sources, generate-test-resources, process-test-resources, test-compile, process-test-classes, test, prepare-package, package, pre-integration-test, integration-test, post-integration-test, verify, install, deploy, pre-clean, clean, post-clean, pre-site, site, post-site, site-deploy.
My class SparkHomework (in the package com.spark) is:
package com.spark;

import freemarker.template.Configuration;
import freemarker.template.Template;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import spark.Request;
import spark.Response;
import spark.Route;
import spark.Spark;

import java.io.StringWriter;
import java.net.UnknownHostException;
import java.util.HashMap;
import java.util.Map;

public class SparkHomework {
    private static final Logger logger = LoggerFactory.getLogger("logger");

    public static void main(String[] args) throws UnknownHostException {
        // FreeMarker loads templates from the classpath root
        final Configuration configuration = new Configuration();
        configuration.setClassForTemplateLoading(SparkHomework.class, "/");

        Spark.get(new Route("/") {
            @Override
            public Object handle(final Request request, final Response response) {
                StringWriter writer = new StringWriter();
                try {
                    Template helloTemplate = configuration.getTemplate("answer.ftl");
                    Map<String, String> answerMap = new HashMap<String, String>();
                    answerMap.put("answer", createAnswer());
                    helloTemplate.process(answerMap, writer);
                } catch (Exception e) {
                    logger.error("Failed", e);
                    halt(500); // halt(int) is inherited from Spark 1.x's Route
                }
                return writer;
            }
        });
    }

    private static String createAnswer() {
        int i = 0;
        for (int bit = 0; bit < 16; bit++) {
            i |= bit << bit;
        }
        return Integer.toString(i);
    }
}
Solved! In PowerShell you have to quote the property name: -D"exec.mainClass". In the plain Command Prompt it works fine without the quotes. Thanks, everyone!
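For reference, the difference comes from how each shell parses the argument: PowerShell splits the unquoted `-Dexec.mainClass=...` at the dot, so Maven receives `.mainClass=com.spark.SparkHomework` as a standalone token and tries to interpret it as a lifecycle phase. Both quoting styles below should work in PowerShell:

```shell
# cmd.exe - no quoting needed
mvn compile exec:java -Dexec.mainClass=com.spark.SparkHomework

# PowerShell - quote the property name, or the whole -D argument
mvn compile exec:java -D"exec.mainClass"=com.spark.SparkHomework
mvn compile exec:java "-Dexec.mainClass=com.spark.SparkHomework"
```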