
Five Ways to Remove Duplicate List Items in Python

This article walks through several ways to remove duplicates that may exist in a Python list, a requirement that comes up in many applications; knowing a few of these methods makes it easier to pick the most effective one for the program at hand.

Method 1: The Naive Method

This approach traverses the entire list and appends each element to a new list only on its first occurrence.

Sample code:

# Python 3 code to demonstrate
# removing duplicates from a list
# using the naive method

# initializing list
test_list = [1, 3, 5, 6, 3, 5, 6, 1]
print("The original list is : " + str(test_list))

# using the naive method
# to remove duplicates
# from the list
res = []
for i in test_list:
    if i not in res:
        res.append(i)

# printing list after removal
print("The list after removing duplicates : " + str(res))

→ Output results:
The original list is : [1, 3, 5, 6, 3, 5, 6, 1]
The list after removing duplicates : [1, 3, 5, 6]

Method 2: List Comprehension

This approach is really a condensed version of the first one: a list comprehension replaces the explicit loop with a single line of code, relying on the side effect of calling append inside the comprehension.

Sample code:

# Python 3 code to demonstrate
# removing duplicates from a list
# using list comprehension

# initializing list
test_list = [1, 3, 5, 6, 3, 5, 6, 1]
print("The original list is : " + str(test_list))

# using list comprehension
# to remove duplicates
# from the list
res = []
[res.append(x) for x in test_list if x not in res]

# printing list after removal
print("The list after removing duplicates : " + str(res))

→ Output results:
The original list is : [1, 3, 5, 6, 3, 5, 6, 1]
The list after removing duplicates : [1, 3, 5, 6]

Method 3: Using set()

This is the most popular way to remove duplicate elements from a list. Its biggest drawback, however, is that the original order of the elements is not preserved; a workaround is sketched after the sample output below.

Sample code:

# Python 3 code to demonstrate
# removing duplicates from a list
# using set()

# initializing list
test_list = [1, 5, 3, 6, 3, 5, 6, 1]
print("The original list is : " + str(test_list))

# using set()
# to remove duplicates
# from the list
test_list = list(set(test_list))

# printing list after removal
# note: the original ordering is lost
print("The list after removing duplicates : " + str(test_list))

→ Output results:
The original list is : [1, 5, 3, 6, 3, 5, 6, 1]
The list after removing duplicates : [1, 3, 5, 6]
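
Because set() discards order, one small workaround, sketched here under the assumption that the original list is still available, is to sort the unique values by their first position in that list.

# sketch: restore the original order after set() by sorting on first occurrence
test_list = [1, 5, 3, 6, 3, 5, 6, 1]

res = sorted(set(test_list), key=test_list.index)

# prints [1, 5, 3, 6], matching the order of first appearance
print("The list after removing duplicates : " + str(res))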

Method 4: Using a List Comprehension + enumerate()

This method combines a list comprehension with enumerate() to remove duplicate elements. An element is skipped if it already appears earlier in the list, which is checked with a slice of everything before the current index. This method keeps the order of the elements unchanged.

Sample code:

# Python 3 code to demonstrate
# removing duplicates from a list
# using list comprehension + enumerate()

# initializing list
test_list = [1, 5, 3, 6, 3, 5, 6, 1]
print("The original list is : " + str(test_list))

# using list comprehension + enumerate()
# to remove duplicates
# from the list
res = [i for n, i in enumerate(test_list) if i not in test_list[:n]]

# printing list after removal
print("The list after removing duplicates : " + str(res))

→ Output results:
The original list is : [1, 5, 3, 6, 3, 5, 6, 1]
The list after removing duplicates : [1, 5, 3, 6]
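
The slice test_list[:n] makes this approach quadratic for long lists. A minimal sketch of a faster variant, assuming the elements are hashable, tracks the values seen so far in a set so that each membership test is constant time while the order is still preserved.

# sketch: order-preserving removal using a set of already-seen values
test_list = [1, 5, 3, 6, 3, 5, 6, 1]

seen = set()
res = []
for x in test_list:
    if x not in seen:       # constant-time membership test
        seen.add(x)
        res.append(x)

print("The list after removing duplicates : " + str(res))  # [1, 5, 3, 6]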

Method 5: Using collections.OrderedDict.fromkeys()

This is the fastest way to accomplish this particular task. It removes the duplicate items by building an ordered dictionary whose keys are the unique elements, and that dictionary is then converted back to a list. This method works for strings as well.

Sample code:

# Python 3 code to demonstrate
# removing duplicates from a list
# using OrderedDict.fromkeys()
from collections import OrderedDict

# initializing list
test_list = [1, 5, 3, 6, 3, 5, 6, 1]
print("The original list is : " + str(test_list))

# using OrderedDict.fromkeys()
# to remove duplicates
# from the list
res = list(OrderedDict.fromkeys(test_list))

# printing list after removal
print("The list after removing duplicates : " + str(res))

→ Output results:
The original list is : [1, 5, 3, 6, 3, 5, 6, 1]
The list after removing duplicates : [1, 5, 3, 6]
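
On Python 3.7 and later, plain dictionaries also preserve insertion order, so the same idea works with the built-in dict and no import; a minimal sketch:

# sketch: dict.fromkeys() keeps insertion order on Python 3.7+
test_list = [1, 5, 3, 6, 3, 5, 6, 1]

res = list(dict.fromkeys(test_list))

print("The list after removing duplicates : " + str(res))  # [1, 5, 3, 6]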

Method 6: Handling duplicate elements in nested lists

This section covers duplicate removal in multidimensional (nested) lists. Here it is assumed that sublists containing the same elements, though not necessarily in the same order, are treated as duplicates. The task is then accomplished with set() + sorted().

Sample code:

# Python3 code to demonstrate
# removing duplicate sublist 
# using set() + sorted()
  
# initializing list
test_list = [[1, 0, -1], [-1, 0, 1], [-1, 0, 1],
                           [1, 2, 3], [3, 4, 1]]
  
# printing original list
print("The original list : " + str(test_list))
  
# using set() + sorted()
# removing duplicate sublist
res = list(set(tuple(sorted(sub)) for sub in test_list))
  
# print result
print("The list after duplicate removal : " + str(res)) 

→ Output results:
The original list : [[1, 0, -1], [-1, 0, 1], [-1, 0, 1], [1, 2, 3], [3, 4, 1]]
The list after duplicate removal : [(-1, 0, 1), (1, 3, 4), (1, 2, 3)]

You can also utilize set() + map() + sorted()

Sample code:

# Python3 code to demonstrate
# removing duplicate sublist 
# using set() + map() + sorted()
  
# initializing list
test_list = [[1, 0, -1], [-1, 0, 1], [-1, 0, 1],
                           [1, 2, 3], [3, 4, 1]]
  
# printing original list
print("The original list : " + str(test_list))
  
# using set() + map() + sorted()
# removing duplicate sublist
res = list(set(map(lambda i: tuple(sorted(i)), test_list)))
  
# print result
print("The list after duplicate removal : " + str(res))

→ Output results:
The original list : [[1, 0, -1], [-1, 0, 1], [-1, 0, 1], [1, 2, 3], [3, 4, 1]]
The list after duplicate removal : [(-1, 0, 1), (1, 3, 4), (1, 2, 3)]
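
Both variants return tuples, since lists are not hashable and must be converted before they can go into a set. If the result should remain a list of lists, the tuples can be converted back, as in this small sketch.

# sketch: convert the deduplicated tuples back into lists
test_list = [[1, 0, -1], [-1, 0, 1], [-1, 0, 1],
             [1, 2, 3], [3, 4, 1]]

res = [list(t) for t in set(tuple(sorted(sub)) for sub in test_list)]

print("The list after duplicate removal : " + str(res))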

That concludes this article on five ways to remove duplicate items from a Python list. For more on removing duplicates from lists in Python, please search my earlier articles or continue browsing the related articles below; I hope you will continue to support me in the future!