Stylized Sequence Generation
In this project, we present an exploratory study of constrained natural language generation in the stylized domain of Shakespearean sonnets. Leveraging both classical probabilistic modeling and modern neural architectures, we evaluate Hidden Markov Models (HMMs) trained via the Baum-Welch algorithm alongside character-level Long Short-Term Memory (LSTM) networks. We show that while HMMs effectively encode poetic structure through probabilistic control over syllable counts and rhyme constraints, LSTMs offer greater local coherence and creative flexibility at the cost of structural discipline. Our contributions include a hybrid rhyme-enforcement algorithm based on phonetic dictionaries, a systematic analysis of meter and stress-pattern violations, and a broad interpretation of the generative limitations arising from hard-form constraints.
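To make the phonetic-dictionary idea concrete, the sketch below shows one common way rhyme and syllable constraints can be checked from ARPAbet-style pronunciations (as in the CMU Pronouncing Dictionary): two words rhyme if their phoneme sequences match from the last stressed vowel onward, and the syllable count equals the number of vowel phonemes. This is an illustrative assumption about the paper's hybrid algorithm, not its actual implementation; the tiny hand-coded `LEXICON` stands in for a full dictionary.

```python
# Illustrative sketch of dictionary-based rhyme and syllable checks.
# LEXICON is a hypothetical stand-in for the full CMU Pronouncing Dictionary:
# each word maps to ARPAbet phonemes, with stress digits on vowels.
LEXICON = {
    "day":   ["D", "EY1"],
    "may":   ["M", "EY1"],
    "night": ["N", "AY1", "T"],
    "light": ["L", "AY1", "T"],
    "love":  ["L", "AH1", "V"],
}

def rhyme_part(phones):
    """Return the phonemes from the last stressed vowel onward."""
    for i in range(len(phones) - 1, -1, -1):
        if phones[i].endswith(("1", "2")):  # primary or secondary stress
            return phones[i:]
    return phones

def rhymes(word_a, word_b):
    """Two words rhyme if their final stressed-vowel suffixes match."""
    return rhyme_part(LEXICON[word_a]) == rhyme_part(LEXICON[word_b])

def syllable_count(word):
    """Count vowel phonemes (those carrying a stress digit)."""
    return sum(p[-1].isdigit() for p in LEXICON[word])
```

A generator could use `rhymes` to filter candidate line endings against the sonnet's ABAB rhyme scheme, and `syllable_count` to accept only lines totaling ten syllables.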