Dining Concierge

Intro

This project is a full-stack, serverless Dining Concierge Chatbot built as part of CS-GY 9223: Cloud Computing & Big Data at New York University. The system demonstrates an end-to-end cloud-native architecture that delivers restaurant recommendations through natural language interaction using fully managed AWS services. The application uses Amazon Lex to handle conversational flow and intent management, with multiple AWS Lambda functions orchestrating request handling, validation, asynchronous processing, and notification delivery.

A REST API exposed via API Gateway enables frontend interaction through an autogenerated JavaScript SDK, while a static web client is hosted on Amazon S3. Restaurant data is sourced from the Yelp API and stored in DynamoDB, with OpenSearch Serverless used to index cuisines and restaurant identifiers for efficient, randomized recommendation retrieval.

To ensure reliability and fault tolerance, Amazon SQS is used for asynchronous message processing, paired with a Dead-Letter Queue (DLQ) for failure handling. Recommendation results are delivered to users via email using Amazon SES, and system behavior is monitored through CloudWatch Logs. The project emphasizes serverless design principles, including scalability, loose coupling, resilience, and minimal operational overhead. It also explores practical tradeoffs in cloud eventing by transitioning from EventBridge-based invocation to an SQS-driven model to achieve stronger retry semantics and DLQ support.
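To make the asynchronous flow concrete, here is a minimal sketch of two pieces of the pipeline: packaging a validated Lex request into an SQS message body, and randomly sampling restaurant IDs (as an OpenSearch query on the requested cuisine might return them) for the recommendation email. The slot names (`Cuisine`, `DiningTime`, `NumPeople`, `Email`) and function names are illustrative assumptions, not the project's actual schema.

```python
import json
import random


def build_sqs_message(slots):
    """Package validated Lex slot values into the SQS message body that
    the downstream suggestion worker consumes.

    Note: slot names here are hypothetical placeholders; the real Lex bot
    may use a different slot schema.
    """
    return json.dumps({
        "cuisine": slots["Cuisine"].lower(),
        "dining_time": slots["DiningTime"],
        "num_people": int(slots["NumPeople"]),
        "email": slots["Email"],
    })


def pick_recommendations(restaurant_ids, k=3):
    """Randomly sample up to k restaurant IDs for the recommendation email.

    The IDs would come from an OpenSearch index keyed by cuisine; random
    sampling gives users varied suggestions across repeated requests.
    """
    return random.sample(restaurant_ids, min(k, len(restaurant_ids)))
```

In the SQS-driven model, the worker Lambda is triggered by the queue; if it raises an exception (e.g., a failed Yelp/DynamoDB lookup), SQS redelivers the message, and after the configured maximum receive count the message lands in the DLQ instead of being silently dropped.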

Next work
