Iterating a list of lists within Python for AWS Lambda
I have a question related to Python.
The use case is to write a Python function (in AWS Lambda) which will look for a set of files in multiple buckets and perform some action, like creating a dummy file in an S3 bucket or triggering another Lambda.
For example:
list1 = ["file1", "file2", "file3"]
list2 = ["file4", "file5", "file6"]
list3 = ["f7", "f8", "f9"]

def lambda_handler(event, context):
    if len(list1) == 9:
        print("something")
        # create dummy file in S3 OR trigger another Lambda
    elif len(list2) == 9:
        print("Something")
    else:
        print("all files are not available")
and so on.
I am a bit confused about how to iterate over the three lists and trigger one Lambda for one set of files, i.e. list1, list2, or list3. Alternatively, I can create a dummy file in S3.
Can anyone please help me with a way to do it?
Could you please clarify your question? Are you asking how to write code to achieve a particular task, or are you asking how to trigger a Lambda function in a particular situation? (And what is the situation?)
– John Rotenstein
yesterday
Actually both. I need help with both. The Lambda should check the three buckets looking for files, for which it should iterate through the lists, and then if the files are available in any one bucket, it should trigger another Lambda. If the files have arrived in all three buckets, then it should trigger the Lambda for each of them, one by one.
– Muskan
15 hours ago
Sorry, I still don't fully understand the task. What is the end-state that you are actually trying to achieve? Does it know the files it is looking for, or is it comparing files between the buckets? Why do you wish to create "dummy files"? What does the "other lambda" then do? Feel free to edit your Question to provide more details.
– John Rotenstein
10 hours ago
Well, thank you so much for looking into it, first of all. I am quite a beginner in Python. So the use case is: there will be 3 types of source files coming into 3 different folders. Suppose 'mybucket' is the bucket name; within it, 3 sub-folders are created. Each folder carries a different set of files (each file arrives as 8 partitions in a particular file pattern). Those files will arrive one after the other in each folder.
– Muskan
13 mins ago
The Lambda has to keep checking for the files within those 3 folders. If it gets all 8 partition files in any one of the 3 folders, it has to create a _success file or trigger another Lambda to process them (not worried about the 2nd Lambda now). The reason behind creating a success file is to put an SNS trigger on it to run the 2nd Lambda. So either a success file is an option, or directly triggering another Lambda is an option (not sure if that's possible). It's rare that all 8 files will arrive in all 3 folders at the same time, but if they do, it has to repeat the above action for all three file types.
– Muskan
13 mins ago
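(Directly invoking another Lambda function is possible via the Lambda Invoke API. As a minimal sketch of the approach described in the comments above, assuming the three "folders" are prefixes under one bucket; the bucket name, prefixes, partition count, and second function name below are placeholders, not values from the question:)

import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

# Placeholder names based on the description in the comments above
BUCKET = "mybucket"
PREFIXES = ["type1/", "type2/", "type3/"]
EXPECTED_PARTITIONS = 8

def lambda_handler(event, context):
    for prefix in PREFIXES:
        # List the objects currently under this folder (prefix)
        response = s3.list_objects_v2(Bucket=BUCKET, Prefix=prefix)
        keys = [obj["Key"] for obj in response.get("Contents", [])]
        # Ignore any marker file when counting partitions
        partitions = [k for k in keys if not k.endswith("_success")]
        if len(partitions) >= EXPECTED_PARTITIONS:
            # Option 1: create the _success marker file for an SNS trigger
            s3.put_object(Bucket=BUCKET, Key=prefix + "_success", Body=b"")
            # Option 2: invoke the second Lambda directly (asynchronously)
            lambda_client.invoke(
                FunctionName="second-lambda",  # placeholder name
                InvocationType="Event",
                Payload=b"{}",
            )
        else:
            print(f"all files are not yet available in {prefix}")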
1 Answer
I would recommend this architecture: configure an Amazon S3 event notification on each folder so that the Lambda function is triggered whenever a new file arrives.
This way, things only happen when files arrive, rather than having to check every n minutes. Also, it will only be triggered on new files arriving, rather than having to skip over existing files that have already been processed or are awaiting other files.
The only potential danger is if all the desired files arrive within a short space of time. Each file would trigger a separate Lambda function, and each of them might see that all files are available and then attempt to trigger the next process. So, be a little careful around that second trigger; you might need to include some logic to make sure the files aren't processed twice.
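As a sketch of what that event-triggered handler might look like, including a simple guard against the double-trigger problem (the folder layout, marker key, and partition count are assumptions carried over from the question, and the check-then-write guard only narrows the race window rather than eliminating it):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
EXPECTED_PARTITIONS = 8  # assumption: 8 partition files per folder

def lambda_handler(event, context):
    # Invoked by an S3 event notification: read the bucket and key of the new object
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]
    prefix = key.rsplit("/", 1)[0] + "/"  # the folder the new file landed in
    marker_key = prefix + "_success"

    # Guard: if the marker already exists, another invocation has handled this folder
    try:
        s3.head_object(Bucket=bucket, Key=marker_key)
        print("already processed, skipping")
        return
    except ClientError as e:
        if e.response["Error"]["Code"] != "404":
            raise  # some error other than "marker not found"

    # Count the partition files currently in the folder
    response = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    keys = [obj["Key"] for obj in response.get("Contents", [])]
    partitions = [k for k in keys if not k.endswith("_success")]

    if len(partitions) >= EXPECTED_PARTITIONS:
        # All partitions have arrived: write the marker, which an SNS
        # notification can then use to start the second Lambda
        s3.put_object(Bucket=bucket, Key=marker_key, Body=b"")
    else:
        print(f"{len(partitions)} of {EXPECTED_PARTITIONS} files in {prefix}")

Two near-simultaneous invocations can still both pass the head_object check, so for a strict guarantee you would want an atomic mechanism such as a DynamoDB conditional write.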