Using AlchemyAPI to Classify Blog Posts–Part 1
August 7, 2012 at 11:57 AM
Obtaining the API Classes
To start using AlchemyAPI, go here to sign up for a user key. After filling in the form, a key will be mailed to you that can be used for 1,000 API calls per day. Next, you'll want to download the .NET classes that simplify calls to the API. You can obtain the classes here. What you receive in the zip file is the source code for the classes, but no project or solution file. So, open Visual Studio, create a new Class Library project, and import the downloaded classes into it. You should end up with something like below:
You'll notice the top class, AlchemyAPI, and then several "*Params" classes. The first class contains methods for making the various calls to the API, and each type of call has two overloads: one uses the default configuration for the call, while the other lets you override the configuration by passing in one of the "*Params" classes. You can find the default values for each call in the AlchemyAPI documentation for that particular call. If you wish to override any of the defaults, simply populate the appropriate params class and pass it into the call.
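As a rough sketch of the two overload styles, a categorization call might look like the following. The method and setter names here follow the SDK's naming pattern but are assumptions on my part, so verify them against the source you downloaded.

```csharp
// Sketch: calling the API with defaults vs. an overridden configuration.
// Names like TextGetCategory, SetAPIKey, and setShowSourceText are
// illustrative; check the downloaded classes for the exact members.
AlchemyAPI api = new AlchemyAPI();
api.SetAPIKey("YOUR-API-KEY"); // the key mailed to you at sign-up

// 1) Default configuration for the call
string xml1 = api.TextGetCategory("Some blog post text to classify.");

// 2) Override the defaults by passing the matching "*Params" class
AlchemyAPI_CategoryParams catParams = new AlchemyAPI_CategoryParams();
catParams.setShowSourceText(true); // hypothetical setting; see the params class
string xml2 = api.TextGetCategory("Some blog post text to classify.", catParams);
```

Either way, the response comes back as XML, which we'll need to parse later when classifying posts.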
I made one change to the supplied classes. Since I used TPL Dataflow to get parallel execution, I didn't want any threads blocking while waiting on a network call to return. For this reason, I added some async and await code.
In the AlchemyAPI class, near the bottom, you'll find a method called DoRequest(). This method makes all of the API calls using an instance of HttpWebRequest. To make the calls async, I first changed the method signature by adding the async keyword.
```csharp
private async Task<string> DoRequest(HttpWebRequest wreq, AlchemyAPI_BaseParams.OutputMode outputMode)
```
Next, I simply changed the actual web call to the following.
```csharp
// Await the response instead of blocking on GetResponse(),
// and dispose the reader along with the response
using (HttpWebResponse wres = await wreq.GetResponseAsync() as HttpWebResponse)
using (StreamReader r = new StreamReader(wres.GetResponseStream()))
{
    xml = await r.ReadToEndAsync();
}
```
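One consequence of this change is that DoRequest() now returns Task&lt;string&gt;, so the public API methods that call it have to propagate the Task, and callers await the result rather than blocking a thread. A minimal sketch of what a caller looks like after the change (ClassifyAsync and the awaited TextGetCategory signature are assumptions for illustration):

```csharp
// Hypothetical caller: because DoRequest is async, the public call
// that wraps it is assumed to return Task<string> as well, and the
// caller awaits it, freeing the thread while the network call runs.
public async Task<string> ClassifyAsync(AlchemyAPI api, string text)
{
    string xml = await api.TextGetCategory(text);
    return xml;
}
```

This is exactly the property TPL Dataflow benefits from: while the request is in flight, no thread is tied up waiting on it.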
The above prepares the API classes for our purposes and puts us in a position to construct the rest of the application. Next time, we'll look at TPL Dataflow, how it can be used in this scenario, and how the application was designed around TDF's capabilities.