How do I get more information about the entities from Wikipedia and DBpedia?
For Wikipedia you can use the MediaWiki API.
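For example, a minimal sketch of building a MediaWiki API request for a plain-text summary of a page. The parameters shown (`action=query`, `prop=extracts`) are standard MediaWiki/TextExtracts options; the page title "Rome" is just an illustration.

```python
from urllib.parse import urlencode

API = "https://en.wikipedia.org/w/api.php"

def summary_url(title):
    # Build a MediaWiki API query URL for the lead section of a page.
    params = {
        "action": "query",
        "prop": "extracts",     # TextExtracts extension (enabled on Wikipedia)
        "exintro": 1,           # only the lead section
        "explaintext": 1,       # plain text instead of HTML
        "titles": title,
        "format": "json",
    }
    return API + "?" + urlencode(params)

url = summary_url("Rome")
```

You can then fetch `url` with any HTTP client and read the extract from the JSON response.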
For DBpedia you can add the value lod to the parameter include (see the documentation) to obtain the URL of the DBpedia resource for each entity. You can then request a Linked Data representation of each DBpedia resource in different formats by setting the appropriate HTTP "Accept" header, or query the DBpedia SPARQL endpoint directly.
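As a sketch of the content-negotiation step: given a DBpedia resource URL (here `http://dbpedia.org/resource/Rome`, just an example; in practice you would take it from the lod field returned with each entity), the Accept header selects the serialization you get back.

```python
from urllib.request import Request

def dbpedia_request(resource_url, fmt="text/turtle"):
    # Content negotiation: the Accept header chooses the RDF serialization,
    # e.g. text/turtle, application/rdf+xml, or application/ld+json.
    return Request(resource_url, headers={"Accept": fmt})

req = dbpedia_request("http://dbpedia.org/resource/Rome", "text/turtle")
```

Opening `req` with `urllib.request.urlopen` would return the Turtle representation of the resource.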
Can I use Dandelion API for Sentiment Analysis? What about feature xy?
We released the Sentiment Analysis API in July 2015 (read the documentation).
When will Dandelion API support language XY?
The short answer is: when there is enough demand for language XY. We officially support 7 languages and offer beta support for 40 more, so we currently cover most of the world's spoken languages.
I cannot send my secret data over the internet, can I buy a copy of Dandelion API to install in my own datacenter?
We need to discuss this with you on a case-by-case basis; please contact us at email@example.com to discuss your requirements.
What the @fx!# is a unit? How does the pricing work?
You get 1,000 units per day for free; if you want more, you need to subscribe to a paid plan.
Each API has its own cost in units: for example, a request to the Entity Extraction API costs 1 unit. Other APIs cost more or fewer units per hit.
To see how many units you have used today and how many you have left, check your account dashboard. Every API response also reports the cost of that call in units and the number of units remaining for the day; check the docs for more details.
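A hypothetical sketch of tracking your remaining units from the API responses. The header name `X-DL-units-left` is an assumption for illustration; check the docs for the exact field your plan's responses expose.

```python
def units_left(headers):
    # Read the remaining daily units from a response's headers, if present.
    # "X-DL-units-left" is a placeholder name -- see the docs for the real one.
    value = headers.get("X-DL-units-left")
    return int(value) if value is not None else None

remaining = units_left({"X-DL-units-left": "987"})
```

You could log this value after each call and alert when it drops below a threshold, rather than waiting for requests to start failing.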