
Commit 6b8377f

erictleung authored and Randommood committed
Fix machine learning paper link and spelling (papers-we-love#419)
1 parent aa4d2c6 commit 6b8377f

File tree

1 file changed: +2 −2 lines changed

machine_learning/README.md

+2 −2
@@ -5,15 +5,15 @@
 * [Top 10 algorithms in data mining](http://www.cs.uvm.edu/~icdm/algorithms/10Algorithms-08.pdf) - While it is difficult to identify the top 10, this paper contains 10 very important data mining/machine learning algorithms
 * [A Few Useful Things to Know about Machine Learning](http://homes.cs.washington.edu/~pedrod/papers/cacm12.pdf) - Just like the title says, it contains many useful tips and gotchas for machine learning
 * [Random Forests](https://www.stat.berkeley.edu/~breiman/randomforest2001.pdf) - The initial paper on random forests
-* [Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data](http://repository.upenn.edu/cgi/viewcontent.cgi?article=1162&context=cis_papers) - The paper introducing conditional random fields as a framework for building probablistic models.
+* [Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data](http://repository.upenn.edu/cgi/viewcontent.cgi?article=1162&context=cis_papers) - The paper introducing conditional random fields as a framework for building probabilistic models.
 * [Support-Vector Networks](http://rd.springer.com/content/pdf/10.1007%2FBF00994018.pdf) - The initial paper on support-vector networks for classification.
 * [The Fast Johnson-Lindenstrauss Transforms](https://www.cs.princeton.edu/~chazelle/pubs/FJLT-sicomp09.pdf)

   The Johnson-Lindenstrauss transform (JLT) prescribes that there exists a matrix of size `k x d`, where `k = O(1/eps^2 log d)` such that with high probability, a matrix A drawn from this distribution preserves pairwise distances up to epsilon (e.g. `(1-eps) * ||x-y|| < ||Ax - Ay|| < (1+eps) ||x-y||`). This paper was the first paper to show that you can actually compute the JLT in less that `O(kd)` operations (e.g. you don't need to do the full matrix multiplication). They used their faster algorithm to construct one of the fastest known approximate nearest neighbor algorithms.

   *Ailon, Nir, and Bernard Chazelle. "The fast Johnson-Lindenstrauss transform and approximate nearest neighbors." SIAM Journal on Computing 39.1 (2009): 302-322. Available: https://www.cs.princeton.edu/~chazelle/pubs/FJLT-sicomp09.pdf*

-* [Applications of Machine Learning to Location Data](ftp://ftp.cs.princeton.edu/techreports/2013/949.pdf) - Using machine learning to design and analyze novel algorithms that leverage location data.
+* [Applications of Machine Learning to Location Data](http://www.berkkapicioglu.com/wp-content/uploads/2013/11/thesis_final.pdf) - Using machine learning to design and analyze novel algorithms that leverage location data.

 ## Hosted Papers

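The Johnson-Lindenstrauss guarantee quoted in the hunk above can be illustrated in a few lines of NumPy. This is a minimal sketch using a plain dense Gaussian projection, not the fast (FJLT) construction from Ailon and Chazelle's paper; the values of `n`, `d`, `eps`, and the seed are illustrative choices, not taken from the paper.

```python
import numpy as np

# Minimal sketch of the JL distance-preservation property:
# (1 - eps) * ||x - y|| <= ||Ax - Ay|| <= (1 + eps) * ||x - y||.
# Dense Gaussian projection only, NOT the fast FJLT from the paper.
rng = np.random.default_rng(0)

n, d = 50, 1000   # number of points, original dimension (illustrative)
eps = 0.25        # target distortion (illustrative)
k = int(np.ceil(8 * np.log(n) / eps**2))  # reduced dimension ~ O(log n / eps^2)

X = rng.normal(size=(n, d))               # sample points
A = rng.normal(size=(k, d)) / np.sqrt(k)  # random projection; E[||Ax||^2] = ||x||^2
Y = X @ A.T                               # projected points, shape (n, k)

def pairwise_dists(Z):
    """Euclidean distances between all pairs of rows of Z."""
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

orig = pairwise_dists(X)
proj = pairwise_dists(Y)
mask = ~np.eye(n, dtype=bool)             # ignore zero self-distances
ratios = proj[mask] / orig[mask]
print(f"distance ratios in [{ratios.min():.3f}, {ratios.max():.3f}]")
```

With a dense `A`, projecting a point costs `O(kd)` operations; the point of the FJLT paper is to get below that by preconditioning with a randomized Hadamard transform before applying a sparse projection.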