Leetcode 1423 - Maximum Points You Can Obtain from Cards
Understanding the problem
There is a row of cards, and each card has an associated number of points; the points are given in the integer array cardPoints. In one step, you take exactly one card from either the beginning or the end of the row. You need to maximize the number of points you can obtain.
The rules for scoring are as follows:
- For each card you take, you get the points written on that card.
- Your total score is the sum of the points on all the cards you take.
You are given an integer k, which represents the exact number of cards you must take. Because each card is taken from the current beginning or end of the row, the cards you end up with always form some prefix of the array together with some suffix of it (either part possibly empty).
Return the maximum number of points you can obtain.
Examples:
Input: cardPoints = [1,2,3,4,5,6,1], k = 3
Output: 12
Explanation: After the first step, your score is always 1; starting from the rightmost card is the better opening. The optimal strategy is to take the three cards on the right, for a total of 1 + 6 + 5 = 12 points.
Input: cardPoints = [2,2,2], k = 2
Output: 4
Explanation: No matter which two cards you take, your score is always 2 + 2 = 4 points.
Constraints:
- 1 <= cardPoints.length <= 10^5
- 1 <= cardPoints[i] <= 10^4
- 1 <= k <= cardPoints.length
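Before planning anything clever, it can help to make the take-from-either-end rule concrete. Here is a minimal brute-force sketch (the function name max_score_brute_force is mine, for illustration only): it tries every split of i cards from the front and k - i cards from the back. Re-summing slices makes it O(k^2), too slow for the largest inputs, but it pins down exactly what we are maximizing.

from typing import List

def max_score_brute_force(cardPoints: List[int], k: int) -> int:
    n = len(cardPoints)
    best = 0
    for i in range(k + 1):
        # i cards from the front, k - i from the back; the back slice
        # cardPoints[n - (k - i):] is empty when i == k.
        best = max(best, sum(cardPoints[:i]) + sum(cardPoints[n - (k - i):]))
    return best

assert max_score_brute_force([1, 2, 3, 4, 5, 6, 1], 3) == 12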
Plan your solution
One tempting approach is to slide a window of size k over the cardPoints array and keep track of the maximum sum, but that solves the wrong problem: the k cards must come from the two ends of the row, not from anywhere in the middle. Whatever we take is always i cards from the front plus k - i cards from the back, for some i between 0 and k, so there are only k + 1 possible splits to examine.
We can evaluate all of them with a single running sum:
- Compute the sum of the first k elements of cardPoints. This corresponds to taking all k cards from the front and is our initial maximum.
- Then, for i from 1 to k, swap one front card for one back card: decrement the sum by cardPoints[k - i] (the last card of the front block) and increment it by cardPoints[n - i] (one more card from the back), where n is the length of the array.
- After each swap, update the maximum sum if the current sum is greater.
Finally, we return the maximum sum. Each split is evaluated in constant time, so the algorithm runs in O(k) time and O(1) extra space. A short trace of the swap loop on the first example is sketched right after this plan.
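Here is a throwaway trace of that swap loop on the first example, just to check the bookkeeping (the names mirror the plan above; nothing here is part of the final solution):

cardPoints, k = [1, 2, 3, 4, 5, 6, 1], 3
n = len(cardPoints)
current = sum(cardPoints[:k])  # 1 + 2 + 3 = 6: all three cards from the front
print(k, 0, current)           # columns: front count, back count, running sum
for i in range(1, k + 1):
    current += cardPoints[n - i] - cardPoints[k - i]
    print(k - i, i, current)
# Prints 3 0 6, 2 1 4, 1 2 8, 0 3 12: the best split takes all three
# cards from the back, matching the expected answer of 12.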
Implement your solution
Now let’s implement the solution in Python.
from typing import List

class Solution:
    def maxScore(self, cardPoints: List[int], k: int) -> int:
        n = len(cardPoints)
        # Start by taking all k cards from the front
        current_sum = sum(cardPoints[:k])
        max_sum = current_sum
        # Swap one front card for one back card at a time: after i swaps
        # we hold the first k - i front cards and the last i back cards
        for i in range(1, k + 1):
            current_sum += cardPoints[n - i] - cardPoints[k - i]
            max_sum = max(max_sum, current_sum)
        # Return the best of the k + 1 front/back splits
        return max_sum
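It is worth knowing an equivalent framing: since the k cards come from the two ends, the n - k cards we skip always form one contiguous block, so we can instead slide a window of size n - k across the array, find the block with the minimum sum, and subtract it from the total. A sketch of that variant (the standalone function and its name are my own; it should return the same values as maxScore above):

from typing import List

def max_score_min_window(cardPoints: List[int], k: int) -> int:
    n = len(cardPoints)
    window = n - k                         # size of the skipped middle block
    window_sum = sum(cardPoints[:window])  # leftmost block of that size
    min_window = window_sum
    # Slide the skipped block one step at a time, tracking its minimum sum.
    for right in range(window, n):
        window_sum += cardPoints[right] - cardPoints[right - window]
        min_window = min(min_window, window_sum)
    return sum(cardPoints) - min_window

This variant scans the whole array, so it is O(n) rather than O(k); both are linear and either is fine here.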
Test your solution
Now we can test our solution with some test cases.
solution = Solution()

cardPoints = [1,2,3,4,5,6,1]
k = 3
assert solution.maxScore(cardPoints, k) == 12

cardPoints = [2,2,2]
k = 2
assert solution.maxScore(cardPoints, k) == 4

cardPoints = [9,7,7,9,7,7,9]
k = 7
assert solution.maxScore(cardPoints, k) == 55

cardPoints = [1,79,80,1,1,1,200,1]
k = 3
assert solution.maxScore(cardPoints, k) == 202

cardPoints = [1,1,1]
k = 3
assert solution.maxScore(cardPoints, k) == 3
All the test cases pass, including the edge case where k equals the length of the array and we must take every card.
That’s it! We have successfully implemented an O(k) solution that finds the maximum number of points you can obtain from the cards by sliding the split between front and back picks.