Why is UNCACHE TABLE in spark-sql not working?

Time:12-17

I'm learning Spark SQL. I used spark-sql to uncache a table that had previously been cached, but after submitting the UNCACHE command I can still query the cached table. Why does this happen?

Spark version 3.2.0(Pre-built for Apache Hadoop 2.7)

Hadoop version 2.7.7

Hive metastore 2.3.9

Linux Info

 Static hostname: master
       Icon name: computer-vm
         Chassis: vm
      Machine ID: 15c**********************10b2e19
         Boot ID: 48b**********************efc169b
  Virtualization: vmware
Operating System: Ubuntu 18.04.6 LTS
          Kernel: Linux 4.15.0-163-generic
    Architecture: x86-64

spark-sql (default)> CACHE TABLE testCache SELECT * FROM students WHERE AGE = 13;
Error in query: Temporary view 'testCache' already exists
spark-sql (default)> UNCACHE TABLE testCache;
Response code
Time taken: 0.092 seconds
spark-sql (default)> SELECT * FROM testCache;
NAME    rollno  AGE
Kent    8   21
Marry   1   10
Eddie Davis 5   13
Amy Smith   3   13
Barron  3   12
Fleur Laurent   4   9
Ivy 3   8
Time taken: 0.492 seconds, Fetched 7 row(s)

CodePudding user response:

UNCACHE TABLE removes the entries and associated data from the in-memory and/or on-disk cache for a given table or view; it does not drop the table or view itself. So you can still query it — subsequent queries simply read from the underlying source instead of the cache.
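To illustrate the distinction, here is a minimal sketch continuing the session above: UNCACHE TABLE only evicts the cached data, while DROP VIEW removes the temporary view that `CACHE TABLE testCache SELECT ...` created (the error message "Temporary view 'testCache' already exists" shows it is a temp view).

```sql
-- Evicts the cached data; the temporary view testCache still exists,
-- so querying it re-reads from the source tables:
UNCACHE TABLE testCache;
SELECT * FROM testCache;

-- To make the name unqueryable, drop the temporary view itself;
-- after this, SELECT * FROM testCache fails with "table or view not found":
DROP VIEW testCache;
```

If the goal is just to refresh the cached data, UNCACHE followed by another CACHE TABLE is enough; DROP VIEW is only needed when you want the name gone entirely.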
