What is the Apriori Algorithm?

In the realm of data mining and machine learning, the Apriori algorithm stands as a fundamental tool for discovering frequent itemsets and deriving association rules from transactional databases. Proposed by Rakesh Agrawal and Ramakrishnan Srikant in 1994, it has been instrumental in market basket analysis, recommendation systems, and understanding customer behavior.

Introduction to the Apriori Algorithm:

At its core, the Apriori algorithm aims to identify itemsets that appear together frequently within a dataset. Consider a supermarket transaction database where each transaction records the items purchased by a customer. Apriori helps in answering questions like: What items are commonly bought together? Are there any patterns in customer purchasing behavior?

How Apriori Works:

The algorithm employs a level-wise search strategy, iterating through increasing itemset sizes until no more frequent itemsets can be found. The key concept is support, which measures how often an itemset occurs in the dataset, expressed either as a raw transaction count or as a fraction of all transactions.
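For instance, with the count-based definition, the support of an itemset is simply the number of transactions that contain all of its items (divide by the total number of transactions for the fractional form). A minimal sketch in Python, with made-up transactions for illustration:

```python
# Support as a transaction count; these transactions are invented for illustration.
transactions = [{"bread", "milk"}, {"bread", "butter"}, {"milk", "butter"}, {"bread", "milk"}]

def support(itemset):
    # Count the transactions that contain every item of the itemset.
    return sum(1 for t in transactions if itemset <= t)

print(support({"bread", "milk"}))                      # 2 (count)
print(support({"bread", "milk"}) / len(transactions))  # 0.5 (fraction)
```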

Here’s a step-by-step breakdown of the Apriori algorithm:

  1. Find frequent 1-itemsets: Scan the database to identify items whose support meets or exceeds a predefined minimum support threshold.
  2. Generate candidate itemsets: Based on the frequent itemsets of the previous level, create candidate itemsets one item larger.
  3. Prune infrequent itemsets: Remove candidate itemsets that fail to meet the minimum support threshold.
  4. Repeat the process: Continue generating and pruning itemsets until no more frequent itemsets can be found; the sketch below mirrors these four steps.
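Taken together, the steps can be written as a short, unoptimized Python sketch (the function and variable names here are illustrative, not a standard library API):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Return every itemset whose support (transaction count) is at least min_support."""
    transactions = [frozenset(t) for t in transactions]

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t)

    # Step 1: frequent 1-itemsets.
    items = {item for t in transactions for item in t}
    frequent = {frozenset([i]) for i in items if support(frozenset([i])) >= min_support}
    all_frequent = set(frequent)

    k = 2
    while frequent:
        # Step 2: candidate k-itemsets from unions of frequent (k-1)-itemsets,
        # dropping any candidate with an infrequent (k-1)-subset (the Apriori property).
        candidates = {a | b for a in frequent for b in frequent if len(a | b) == k}
        candidates = {c for c in candidates
                      if all(frozenset(s) in frequent for s in combinations(c, k - 1))}
        # Step 3: keep only candidates that meet the minimum support threshold.
        frequent = {c for c in candidates if support(c) >= min_support}
        all_frequent |= frequent
        k += 1  # Step 4: repeat with larger itemsets until none survive.

    return all_frequent

print(apriori([{"bread", "milk"}, {"bread", "butter"}, {"bread", "milk"}], min_support=2))
```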

Example:

Consider a transaction database with the following transactions:

Transaction 1: {bread, milk, eggs}

Transaction 2: {bread, butter}

Transaction 3: {milk, butter}

Transaction 4: {bread, milk}

Transaction 5: {bread, milk, butter}

Let’s assume a minimum support threshold of 2.

1. Find frequent 1-itemsets: {bread} (support 4), {milk} (support 4), {butter} (support 3); {eggs} is dropped because its support of 1 is below the threshold.

2. Generate candidate 2-itemsets: {bread, milk}, {bread, butter}, {milk, butter}

3. Prune infrequent itemsets: none are pruned here, since {bread, milk} has support 3 and both {bread, butter} and {milk, butter} have support 2.

4. Generate the candidate 3-itemset {bread, milk, butter}: it appears in only one transaction (support 1), so it is pruned and the algorithm terminates.
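Assuming the mlxtend library is installed, the same frequent itemsets can be reproduced in a few lines; note that mlxtend expects min_support as a fraction of transactions, so a count threshold of 2 out of 5 becomes 0.4:

```python
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori

transactions = [
    ["bread", "milk", "eggs"],
    ["bread", "butter"],
    ["milk", "butter"],
    ["bread", "milk"],
    ["bread", "milk", "butter"],
]

# One-hot encode the transactions into a boolean DataFrame.
te = TransactionEncoder()
df = pd.DataFrame(te.fit(transactions).transform(transactions), columns=te.columns_)

# min_support is a fraction of all transactions: 2 out of 5 = 0.4.
print(apriori(df, min_support=0.4, use_colnames=True))
```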

Comparison with the Eclat Algorithm:

While the Apriori algorithm is based on a breadth-first search, the Eclat algorithm takes a depth-first approach. Eclat stands for Equivalence Class Clustering and bottom-up Lattice Traversal. It uses a vertical data format in which each item is mapped to the set of IDs of the transactions that contain it (its tidset), and frequent itemsets are found by intersecting these tidsets.
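As a rough sketch of the vertical layout, using the same five transactions as above, each item maps to its tidset and the support of a pair is simply the size of an intersection:

```python
# Vertical layout: item -> set of transaction IDs that contain it (its "tidset"),
# built from the five example transactions above.
tidsets = {
    "bread":  {1, 2, 4, 5},
    "milk":   {1, 3, 4, 5},
    "butter": {2, 3, 5},
}

# Support of {bread, milk} is the size of the intersection of the two tidsets.
bread_milk = tidsets["bread"] & tidsets["milk"]
print(bread_milk, len(bread_milk))  # {1, 4, 5} 3
```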

Key Differences:

  1. Search Strategy: Apriori uses a breadth-first search, while Eclat employs a depth-first search.
  2. Data Structure: Apriori works on the horizontal transaction layout, often using a hash tree or bitmap for candidate counting, while Eclat uses a vertical (tidset) format.
  3. Performance: Eclat typically performs better on dense datasets, while Apriori might be more efficient on sparse datasets.

| Aspect          | Apriori Algorithm                                                 | Eclat Algorithm                                                    |
|-----------------|-------------------------------------------------------------------|--------------------------------------------------------------------|
| Search Strategy | Breadth-first search                                              | Depth-first search                                                 |
| Data Structure  | Hash tree or bitmap over the horizontal layout                    | Vertical data format (tidsets)                                     |
| Performance     | Effective for sparse datasets; may suffer from scalability issues | More efficient on dense datasets; better scalability               |
| Number of Scans | Multiple scans of the dataset                                     | Fewer scans compared to Apriori                                    |
| Pruning         | Prunes infrequent itemsets using the Apriori property             | Uses tidset intersections to avoid generating infrequent itemsets  |

Conclusion:

The Apriori algorithm remains a cornerstone in the field of data mining, providing a systematic approach to discovering frequent itemsets and extracting meaningful association rules from transactional databases. While it has its limitations, particularly the repeated database scans and large candidate sets that arise on big datasets, its simplicity and effectiveness make it a valuable tool for uncovering hidden patterns in diverse domains.

Useful Links:

  1. Eclat Algorithm — https://medium.com/p/fe07d33fcc5b
  2. Computational graphs VS Sequential layers in Neural Networks — https://medium.com/p/44467fd3296c
