All Apps and Add-ons

How can I run Japanese NLP with the Splunk App for Data Science and Deep Learning?

p2181832
New Member

Hello!

DSDL (formerly the Deep Learning Toolkit) is set up and the Golden Image CPU container is running.
When I ran the example "Entity Recognition and Extraction Example for Japanese using spaCy + Ginza Library", the following error occurred.

MLTKC error: /fit: ERROR: unable to initialize module. Ended with exception: No module named 'ja_ginza'
MLTKC parameters: {'params': {'algo': 'spacy_ner_ja', 'epochs': '100'}, 'args': ['text'], 'feature_variables': ['text'], 'model_name': 'spacy_ginza_entity_extraction_model', 'output_name': 'extracted', 'algo_name': 'MLTKContainer', 'mlspl_limits': {'handle_new_cat': 'default', 'max_distinct_cat_values': '100', 'max_distinct_cat_values_for_classifiers': '100', 'max_distinct_cat_values_for_scoring': '100', 'max_fit_time': '600', 'max_inputs': '100000', 'max_memory_usage_mb': '4000', 'max_model_size_mb': '30', 'max_score_time': '600', 'use_sampling': 'true'}, 'kfold_cv': None}
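
For what it's worth, my reading of the error is that the container's Python environment simply does not have the ja_ginza model package installed. As a rough sketch of what I assume the check would look like from a notebook or terminal inside the container (the package names are taken from the GiNZA project documentation, not from anything DSDL-specific):

# Assumed sketch: install GiNZA and the Japanese model package inside the container,
# then confirm spaCy can load it before re-running | fit from Splunk.
# Package names come from the GiNZA docs; not verified against the Golden Image CPU image.
import subprocess, sys

subprocess.check_call([sys.executable, "-m", "pip", "install", "-U", "ginza", "ja_ginza"])

import spacy

nlp = spacy.load("ja_ginza")   # fails again here if ja_ginza is still missing
doc = nlp("銀座でランチをご一緒しましょう。")
print([(ent.text, ent.label_) for ent in doc.ents])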

Does the "Golden Image CPU" container support Japanese entity extraction out of the box?

Any help would be much appreciated. Thank you! 😀
