Coming from this issue on d.o: Evaluate third party libraries to replace drupal_http_request() and the browser in DrupalWebTestCase
Definitions are listed in order of importance (most important first).
Parallel Requests: The project uses curl_multi_select(), stream_select(), or socket_select() when issuing multiple requests.
Non Blocking Requests: Open a connection, write to it, and close it without waiting to read the response.
Callback: Run custom code once a connection is done, via call_user_func() or call_user_func_array().
PSR-0: Does the project follow the PSR-0 autoloading standard?
Symfony compatible objects: Request/response objects that are compatible with the Symfony ones.
Pool/Throttle: Limit the per-domain & total number of connections. When issuing 10k requests, don't open them all at the same time.
Proxy Support: Can requests be tunneled through a proxy?
Cookie Parsing: Are cookies pulled out of the header?
Global Timeout: Maximum number of seconds the whole call can take (only matters for parallel requests).
Async Connection: When opening a connection, do not block.
Complex SSL Logic: Verify the peer (require verification of the SSL certificate used) & use local certificates (Certificate Authority file).
Send Files: The ability to "upload" a file to a server (shows up in $_FILES).
FTP Connection: Get a file off of an FTP server.
Full HTTP 1.1 compliance: Follows all requirements of an HTTP 1.1 client.
Auto Encode Array Data: http_build_query() is applied to data structures automatically.
Alter streams mid-execution: Example - request 20 URLs & stop after at least 5 have returned.
Set Read/Write Chunk Size: I needed to adjust the write chunk size when sending a lot of data to an IIS server.
Persistent connections: Are connections reused between requests?
Streaming bodies: Can the entity body of a request or response be streamed, or must it be loaded in its entirety into a string?
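To make the top criteria concrete, here is a minimal sketch (not taken from any of the projects compared below) of parallel requests with a completion callback, using curl_multi_select() and call_user_func() as described above. The function name parallel_get() and its signature are made up for illustration:

```php
<?php
// Hypothetical sketch: fetch several URLs in parallel with the
// curl_multi API and invoke a user-supplied callback as each
// transfer completes.
function parallel_get(array $urls, $on_done) {
  $multi = curl_multi_init();
  $handles = array();
  foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    curl_multi_add_handle($multi, $ch);
    $handles[] = array('url' => $url, 'ch' => $ch);
  }
  $running = 0;
  do {
    curl_multi_exec($multi, $running);
    // Wait until at least one transfer has activity instead of busy-looping.
    curl_multi_select($multi, 1.0);
    // Fire the callback for every transfer that just finished.
    while ($info = curl_multi_info_read($multi)) {
      $done = $info['handle'];
      foreach ($handles as $i => $h) {
        if ($h['ch'] === $done) {
          call_user_func($on_done, $h['url'], curl_multi_getcontent($done));
          unset($handles[$i]);
          break;
        }
      }
      curl_multi_remove_handle($multi, $done);
      curl_close($done);
    }
  } while ($running > 0);
  curl_multi_close($multi);
}
```

Pool/throttle support would sit on top of this: add only N handles to the multi handle at a time and add the next one from the queue inside the completion loop.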
Some other things to consider are test code, documentation, and example use cases, though these are a bit harder to compare. The reason Parallel, Non Blocking, and Callbacks are at the top is for building a multi-process library on top of HTTP requests; without these 3, the power of a multi-process library is significantly reduced. One more thing to consider: all of the GitHub projects require cURL or will eventually require it.
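For contrast, a non-blocking request as defined above can be sketched like this. fire_and_forget() is a hypothetical name, and the request line is deliberately minimal:

```php
<?php
// Hypothetical sketch of a non-blocking request: open a connection,
// write the request, and close without waiting to read the response.
function fire_and_forget($host, $port, $path) {
  $fp = @stream_socket_client("tcp://$host:$port", $errno, $errstr, 2.0);
  if (!$fp) {
    return FALSE;
  }
  $request = "GET $path HTTP/1.1\r\n"
           . "Host: $host\r\n"
           . "Connection: Close\r\n"
           . "\r\n";
  fwrite($fp, $request);
  // Close immediately; the response (if any) is never read.
  fclose($fp);
  return TRUE;
}
```

Note that stream_socket_client() itself still blocks during connect here; a true async connection (as defined above) would additionally pass STREAM_CLIENT_ASYNC_CONNECT.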
Comparison of all 5 GitHub projects & HTTPRL is as follows:
| Non Blocking Requests | Unknown | Unknown | Yes | Yes | Yes | Yes |
| Async Connection | No | No | No | Unknown | No | Half Yes |
| Complex SSL Logic | Half Yes | No | Yes | No | Yes | Yes |
| Send Files | Yes | No | Yes | No | Yes - with @ in CURL POST Options | No |
Notes:
 - Blocks on DNS lookups
 - Cannot set CURLOPT_CAINFO
 - Doesn't handle "100 Continue" correctly
 - Can be done by passing cURL options directly in the request
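The last point ("Can be done by passing curl options directly in the request") would look roughly like this with plain cURL. The option constants are standard cURL, but the CA bundle path is an assumption that varies by system:

```php
<?php
// Sketch: setting the "Complex SSL Logic" options by passing cURL
// options directly. The CA bundle path below is an example only.
$ch = curl_init('https://example.com/');
curl_setopt_array($ch, array(
  CURLOPT_RETURNTRANSFER => TRUE,
  CURLOPT_SSL_VERIFYPEER => TRUE,  // require a valid peer certificate
  CURLOPT_SSL_VERIFYHOST => 2,     // and require it to match the host name
  CURLOPT_CAINFO => '/etc/ssl/certs/ca-certificates.crt', // CA file
));
```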