package spark does not exist

Time:10-17

I'm just starting to learn Java.

I am using IntelliJ to create a SparkJava (see: sparkjava.com) app using OpenJDK 17. I followed the tutorial instructions outlined in "Setting up Spark with Maven" (https://sparkjava.com/tutorials/maven-setup). I believe the instructions are very outdated, because they did not work. After some googling, I finally arrived at the following code and pom.xml. When I build the project, I get an error: java: package spark does not exist

I don't know what to do.

  • I added the dependency to my POM.xml.
  • I added the apache.spark.core_2.13 library via Project Structure.

I googled "IntelliJ package does not exist" but couldn't find a helpful answer. Almost everyone said to "add the dependency to the POM," but I've already done that. I also googled "add package to Java project IntelliJ," but after clicking a number of links, I couldn't find a helpful answer there either. I tried a few of the suggestions, but none resolved the problem.

I think I am missing something fundamental here, like somehow I'm not telling IntelliJ where to find the spark code.

src/main/java/Sparky.java

import static spark.Spark.*;

public class Sparky {
    public static void main(String[] args){
        get("/hello", (req, res) -> "Hello World");
    }
}

pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>org.example</groupId>
    <artifactId>sparky</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <maven.compiler.source>17</maven.compiler.source>
        <maven.compiler.target>17</maven.compiler.target>
    </properties>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->

    <dependencies>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.13</artifactId>
            <version>3.2.0</version>
        </dependency>
    </dependencies>
</project>

CodePudding user response:

I guess you didn't configure IntelliJ for Maven properly. Either your project is not a Maven project (as far as IntelliJ is concerned, even if the structure of the project is valid for Maven), or you're missing an IntelliJ plugin or something. Look at IntelliJ's doc on how to create and manage a Maven project.

CodePudding user response:

The dependency was completely wrong. The groupId org.apache.spark points at Apache Spark, the big-data processing engine, not at Spark Java, the web microframework, which is published under com.sparkjava. The correct dependency is:

<dependency>
    <groupId>com.sparkjava</groupId>
    <artifactId>spark-core</artifactId>
    <version>2.9.3</version>
</dependency>
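With that change in place, the dependencies section of the asker's pom.xml would look like the following (version 2.9.3 matches the answer above; newer releases may exist):

<dependencies>
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.9.3</version>
    </dependency>
</dependencies>

After editing the pom, reload the Maven project in IntelliJ (Maven tool window, Reload All Maven Projects) so the IDE re-resolves dependencies; the import static spark.Spark.*; line should then compile. Running the app and opening http://localhost:4567/hello (4567 is Spark Java's default port) should return "Hello World".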