Loading a third-party dynamic library (.so file) in Spark

Time: 09-22

I have a task that parses age and gender from audio, and the process requires a third-party dynamic library (a .so file). Could someone advise how to load and use a third-party library in Spark?

CodePudding user response:

Each Spark node is also a Linux host. Put the .so file on every node and call it from Java (e.g. via JNI); give that a try.

CodePudding user response:

That approach is clumsy. What I want is to load the library from the Spark code itself and have it distributed to all compute nodes, analogous to Hadoop's DistributedCache.addFileToClassPath().

CodePudding user response:

I am using Python. In PySpark, SparkContext.addFile(path) can be used to distribute a file to the compute nodes for use there ("Add a file to be downloaded with this Spark job on every node"), so calling the library is no problem; that is what I used.
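The addFile approach described above can be sketched as follows. The library name `libagegender.so` and the function in the comments are hypothetical placeholders (the original post never names the library); the runnable part below demonstrates the same ctypes loading pattern using the standard C math library instead:

```python
import ctypes
import ctypes.util

# On the driver (sketch; requires a running SparkContext `sc`):
#   sc.addFile("/path/to/libagegender.so")   # ships the .so with the job
#
# Inside a task on a worker node:
#   from pyspark import SparkFiles
#   lib = ctypes.CDLL(SparkFiles.get("libagegender.so"))
#   lib.parse_age.restype = ctypes.c_int     # hypothetical function
#
# The same ctypes pattern, shown here with the system math library
# so the snippet is runnable without Spark:
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.cos.restype = ctypes.c_double           # declare the return type
libm.cos.argtypes = [ctypes.c_double]        # declare the argument types
print(libm.cos(0.0))                         # prints 1.0
```

Declaring `restype` and `argtypes` matters: without them, ctypes defaults to int arguments and return values, which silently corrupts double-valued calls.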

CodePudding user response:

Has the original poster found a solution?

CodePudding user response:

A possible solution is described here:
http://blog.csdn.net/ddjj_1980/article/details/74940593