Term 2 and 3 Serra

Week 1

Serra introduced us to the new collaboration unit and told us to have an idea of who we wanted to collaborate with and a rough sense of what it would be about. We had two briefs, from LCF (London College of Fashion) and UNESCO, and we would get more information about them in a meeting next class. She told us we could also come up with our own briefs in case none of them appealed to us. Serra also told us about the submission, where we had to hand in a portfolio of the group work and video proof of the ongoing collaborative project.

We also had a really fun activity where Serra told us to fold a sheet of paper into thirds, and then each of us had to draw a head, a body and legs on different sheets of paper that classmates passed around. The game was called Exquisite Corpse and it was super fun to see the end results! This was also the first time I was able to collaborate with my classmates.

Collab Unit Brief

We had three briefs presented to us. One was from LCF, where we could collaborate with their students to showcase something related to craft week. Then there was Numberfit, a teaching platform for students who need help with math; they wanted students to help them out with Augmented Reality. The last brief was from UNESCO, who gave us a session explaining that they wanted a strong idea that could tackle an issue related to ocean conservation and how little it is spoken about.

Week 2

Serra taught us about decentralization and why it is important for us, especially in terms of workflow. The industry is shifting from centralized to decentralized workflows because they are more efficient: collaboration becomes a form of decentralized labor, and the project workflow ends up more diverse and inclusive.

She also introduced us to the concept of direction, of which there are two types: creative and art. Direction refers to guidance, while design is the perfecting of technique. Art directors conceptualize the visual direction of a project and are great researchers, while creative directors need to grasp the implications of design choices. We also discussed what good art direction needs: clear communication of the project and readability. I understood that researching my audience is very important, because they are the ones who will see my work after all. I learnt a few new terms relating to the innovation adoption curve and the innovation diffusion curve, which involve innovators, early adopters, the early majority, the late majority and laggards. Empathy maps also play a vital role, as they show us how people think and feel, what they say and do, and what they hear and see.

Collab Unit Brief

We had another meeting for the collab unit briefing, where we networked more with people this time. Our course leader Friendred helped us engage with people from other courses and I was finally able to find potential teammates to collaborate with. I decided to go ahead with the UNESCO brief as I felt it was a great opportunity presented to us. I found people from MA Animation, MA VFX and MA Music Production to collaborate with. We decided to have a meeting next week to decide how we wanted to proceed with the brief.

Week 3

Serra’s class was super interesting! I finally got to see a motion capture suit for the first time! Serra opened a huge crate containing the various parts of the body suit and asked one of our classmates to volunteer to try it on. We ran into a few difficulties initially, as the suit wasn’t pairing/syncing with the software and Serra was finding it hard to calibrate it. But once the sensors finally calibrated properly, we recorded a trial animation with the rig and imported it into Unreal Engine. Inside UE, we were able to use MetaHuman to integrate the rig and check out the animation on our character. Later, Serra showed us how to try retargeting the rig in case we needed to clean up our animation afterwards. We imported a character from Mixamo and then used the mocap software to rename the bones of the skeletal rig and change the hierarchy a bit so that the two matched.
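Out of curiosity, I later jotted down how that renaming step could be scripted with Blender's Python API instead of doing it by hand. This is only a rough sketch: the armature object name and the bone mapping below are placeholders, not the actual names our mocap software or the Mixamo rig used.

```python
import bpy

# Placeholder mapping from Mixamo-style bone names to the target skeleton's names
BONE_MAP = {
    "mixamorig:Hips": "Hips",
    "mixamorig:Spine": "Spine",
    "mixamorig:LeftUpLeg": "LeftUpLeg",
    # ...extend with the rest of the skeleton
}

armature = bpy.data.objects["Armature"]  # assumed name of the imported rig

for old_name, new_name in BONE_MAP.items():
    bone = armature.data.bones.get(old_name)
    if bone is not None:
        bone.name = new_name  # rename so the hierarchy matches the mocap skeleton
```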

Collab Unit Brief

We finally had our first meeting this week as a team and it was great to interact with everyone. We firmly decided that we wanted to work on the UNESCO brief and that the output would be a 30-second short film. Our idea was to show audiences what lies in the depths of the ocean, all the unexplored deep sea that not many people know about.

Week 4

I missed Serra’s class this week as I was unwell, but I got time to research my collab unit and made some progress on it.

Week 5

In Serra’s class she sat with each one of us and went through everyone’s progress on the collaborative unit. I showed her the brief I was working on with my team, related to UNESCO. She liked our team’s idea about the deep-sea world, but gave me feedback to incorporate a time aspect into it as well, to make the story more captivating. She also told me that UNESCO was expecting a lot of research-oriented work, so I would need to get back to my teammates and tell them to study more about the ocean depths. With whatever time was left, Serra introduced us to the concept of photogrammetry. It was something new. I learnt how we could 3D scan an object and bring it into our 3D space. It involved taking photos or videos of the object from different angles and then importing that data into software for it to generate a 3D replica with textures and materials. The more complex the object, the more reference data the software needs to generate a good output.

Week 6

It was finally field trip day and we were able to visit the Tate Modern. It was a really fun visit and each of us had to take 3D scans of any three objects on display in the museum. We had to roam around and find displays with 360-degree access, so that we could capture all views of the object. We had to take a video, going around each object for about thirty seconds. I found three objects I could get a good video of and stored them on my phone.

Collab Unit

We finally made progress on the roles and who would do what in our team. I would be in charge of creating the underwater environments, mainly in Blender; Chay and Mahadev would take care of the initial above-the-ocean environment in Unreal Engine; and Oni and Charbel would take care of the fish animations we were showing in the scene.

Week 7

I was really looking forward to Serra’s class as we would finally be able to take our videos from the Tate Modern and turn them into 3D models using the software in class. I got to know that the software was called Reality Capture and we could only use the desktop systems for the licensed version. Once all of us had the software open, we were able to import any of the 360 videos we had taken. When the processing finished we could see a lot of reconstructed cameras, and after rotating the object so that it rested on the right axes, we could finally generate a 3D mesh from it. After that, Serra showed us various ways to clean up the mesh and spot irregularities. There were mainly four ways to go about it: smoothing, hole filling, mesh simplification and lasso-tool filtering. Smoothing basically means making the surface smoother, hole filling closes gaps in the mesh, and mesh simplification lowers the polycount of a very dense mesh. Lasso-tool filtering lets us select portions or faces of the mesh we don’t need and then delete or hide them. Once all this was done and we were happy with the mesh, we could generate textures, and within minutes we could export the entire model with textures as an FBX or any other format we chose. I was able to scan an oats container and it came out decently well.
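Since I do most of my clean-up in Blender anyway, I noted down roughly how the simplification and smoothing steps could be reproduced there with Python, in case I ever need to batch-clean a scan outside Reality Capture. This is just a sketch of the general idea, assuming the scan is the active object; the ratio and iteration values are arbitrary.

```python
import bpy

obj = bpy.context.active_object  # the imported photogrammetry scan

# Mesh simplification: lower the polycount of the very dense scan
decimate = obj.modifiers.new(name="Simplify", type='DECIMATE')
decimate.ratio = 0.25  # keep roughly a quarter of the faces

# Smoothing: relax the noisy surface a little
smooth = obj.modifiers.new(name="Smooth", type='SMOOTH')
smooth.iterations = 5

# Apply both so the exported FBX carries the cleaned-up mesh
for mod in (decimate, smooth):
    bpy.ops.object.modifier_apply(modifier=mod.name)
```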

Collab Unit

I attended the UNESCO feedback session on behalf of our team and got good feedback from the speaker. She told me that we just had to do more research on the flora and fauna in our scene and focus on the organisms found in the depths of the ocean rather than on anything pollution related.

Week 8

Serra went through our collab unit briefs to see if we needed help with anything, because the deadline for submission was fast approaching. She saw my team’s work and was happy with the progress compared to what she had seen earlier. But she still told me to work on the environment a bit more so that it blended with the fish and the surroundings, especially the deep-sea elements. She also told me to work on the camera movement and the speed of the underwater bubbles, which I had created in Blender using a particle system modifier.
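For reference, this is roughly how my bubble setup looks when scripted in Blender Python. It is only a sketch of the idea rather than my exact file: the emitter object name and all the numbers are placeholder values I would still tune against Serra's feedback about the bubble speed.

```python
import bpy

emitter = bpy.data.objects["BubbleEmitter"]  # assumed name of a plane on the sea floor

# Add a particle system modifier to the emitter
emitter.modifiers.new(name="Bubbles", type='PARTICLE_SYSTEM')
settings = emitter.particle_systems[-1].settings

settings.count = 800                      # number of bubbles
settings.frame_start = 1
settings.frame_end = 250
settings.lifetime = 200                   # how long each bubble lives
settings.normal_factor = 0.5              # rise speed along the emitter's normals
settings.effector_weights.gravity = 0.0   # ignore gravity so the bubbles drift upwards
```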

Collab Unit

I worked hard on the brief because there were barely two weeks left to finish the video. I added more elements to my environment, including seaweed and huge rock formations. I also sat with one of my teammates to get feedback on how to properly lay out the scene, because this was something I had never done before. I was also able to animate the shark a bit better, adding smoother movements using animation fundamentals.

Week 9

This class was mainly about preparing us for the showcase, as we needed to tell both Friendred and Serra what equipment we would need to set up our videos/animations/installations. Since our team only had a video to showcase, we just needed a basic desktop setup with earphones for audiences to watch our short film. It was nice to see others’ videos, especially from the other teams doing the UNESCO brief. Although the brief was open ended, it was awesome to see how clear the other teams were about the message they wanted to put out there.

Collab Unit

The main thing this week was to record our process of working on the UNESCO brief. So my entire team decided to meet in person and we gathered in one of the animation rooms. We started to see how everything looked together, especially the sync between the camera being above water and the transition to it going underwater. We also tried to improve the animations of the fish, as they looked like they were flying across the screen.

Week 10

I was finally ready for the collab unit submission for the UNESCO brief and I think it came out pretty decent. My teammates worked on compiling everything together and we had a few last-minute tweaks to do before the showcase. I needed to render out the entire underwater sequence with a transparent background so that the VFX team could use it as a layer and overlay the fish they needed to add. I also needed to create a poster for the submission, so that had to be done as well.
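The transparent render itself comes down to a couple of settings changes in Blender. Here is a minimal sketch of the kind of setup I used, assuming a PNG sequence output; the output path is just a placeholder.

```python
import bpy

scene = bpy.context.scene

# Render with a transparent background so the fish layer can be composited on top
scene.render.film_transparent = True

# Export a PNG sequence with an alpha channel
scene.render.image_settings.file_format = 'PNG'
scene.render.image_settings.color_mode = 'RGBA'
scene.render.filepath = "//renders/underwater_"  # placeholder path, relative to the .blend
```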

Week 11

I was curious to know what Serra was going to teach us this term, as this was the first class of the new term. For starters, the class got divided in half: one half got sent to the motion capture room and the rest stayed behind for Serra to talk to them about their artefact video. I was part of the mocap group and it was really fun! Kay, the technician there, is a super charismatic individual and he explained the Vicon motion capture suit to all of us. There were a lot of procedures involved, especially around calibration, but the entire process was seamless. Our classmate volunteered to try the suit on and did a few combat animations for us to see. I got to know the entire setup process. I learnt that there are two types of mocap: inertial, which works on the principle of inertia, and optical, which works with optical sensors. We were using the optical type.

Here is what I understood the pipeline to be:

There were about 16 cameras in the room that could see the person wearing the suit. The first step was to calibrate the cameras: the more wand movement the cameras could see, the better the calibration. To do this, a Vicon active calibration wand is used.

Mocap Calibration

Before starting anything, it was important to mask out anything shiny in the room as it may affect the cameras capturing the suit’s data.

Once the calibration was done, we had to place the wand on the floor (in the center) to set the plane/origin, i.e. the orientation in the software. We needed to tell the software exactly where the floor was, which is done using markers placed on the floor at roughly each corner.

Then we set the offset value for each of the markers so that they sit perfectly on the floor, which levels everything to the plane. Once this is done, we can set the plane and a new boundary volume is created.

Then the performer wearing the suit can enter the space within this volume. The suit has markers, placed asymmetrically so that the software can tell left from right and pick up movements better. Then we check whether the cameras can see all 57 markers on the suit.

Mocap suit in A-pose

Then the performer has to stand in an A-pose and hold it for a while, so we can check that all the markers are detected and the right count is reached. Then they do basic warm-up exercises to calibrate the suit, and that’s it, we’re ready to go ahead with the animation.

Week 12

As we got divided into two groups again, this time the other group went for the motion capture class and we stayed behind for Serra’s lecture. She introduced us to the unit and told us what we would potentially be getting into. She told us a bit about experimental animation and a few immersive experiences. She also talked to us about installations and projection mapping that other artists have used. In this class she explained user experience and how our work matters to the audiences who view it. I got to know that for this term we had to create an artefact video, and I began ideating about what I could do. I spoke to Serra about what we actually had to make, and she told me that we needed to create a video or experience that had some kind of animation aspect to it. I wanted to either create a racing game in Unreal Engine or make a small funny short film in Maya or Blender, but I was unsure of the idea.

I researched the racing game idea a bit and watched a few tutorials on YouTube about creating a racing game in Unreal Engine, and I found a really good one. I didn’t find much time to work on it this week though, as I was doing a lot of part-time work and had just started a new job.

Racing Game Idea

Extra

This week we got to attend the most famous CG event in London, called Vertex! I was super amazed at all the talks that happened there; all of this was a first for me. I literally wanted to attend every single talk, but unfortunately there were different halls and the talks were happening simultaneously in each of them. I attended a few good talks on concept art, artists showcasing their work, good working habits in general, a few short film showcases, and panel discussions about upcoming trends like new software and the AI revolution. The highlight of the event had to be the talk from Pixar, where I got to learn about RenderMan and VR sculpting.

Week 13

Serra had told us earlier to come to this class prepared with an Unreal Engine scene to work on, so I found a cool city scene for it. She introduced us to the world of virtual production and showed us a few cool videos of how it is used, especially in music videos and games. I understood that it is a way of making films using a bunch of techniques like AR, CGI, motion capture and other technologies. Serra said we needed iPhones or iPads to try virtual production in Unreal Engine. Unfortunately I had neither, and even though I borrowed an iPhone from Sunny in the staff room, my Unreal project did not work because that iPhone didn’t have cellular data, so I ended up using my classmate’s iPhone to check out the experience. It was pretty cool, as I could walk around the class holding the iPhone in front of me. The more I walked, the more I could see different parts of the scene, and I could explore whichever part looked interesting to me.

The app was called Live Link VCAM, a real-time virtual camera setup for Unreal Engine. It basically lets the iPhone/iPad drive a virtual camera: the device’s movement is streamed into Unreal Engine in real time, while the Unreal viewport is streamed back to the device. These shots can then be framed within virtual scenes in real time and used for virtual production if need be. I understood that a lot of filmmakers and content creators use this to visualize and capture shots in a cool manner, as they can see the virtual scene through the phone as if it were a real camera, in real time. There were a bunch of plugins we needed to enable in Unreal Engine for this to work, like Virtual Camera, Live Link and Apple ARKit, to name a few.
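Plugins are normally ticked on in the editor's Plugins window, but they also just end up recorded in the .uproject file, so in principle the same change can be scripted. Below is only a hedged sketch of that idea; the project file name is a placeholder and the plugin name spellings are my assumption of how they appear internally.

```python
import json
from pathlib import Path

# Placeholder path to the Unreal project file
project_file = Path("MyVCamTest.uproject")
data = json.loads(project_file.read_text())

# Plugin names as I believe they appear internally (assumed spellings)
needed = ["VirtualCamera", "LiveLink", "AppleARKit"]

plugins = data.setdefault("Plugins", [])
already_listed = {p.get("Name") for p in plugins}
for name in needed:
    if name not in already_listed:
        plugins.append({"Name": name, "Enabled": True})

project_file.write_text(json.dumps(data, indent="\t"))
```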

Week 14

This class was mainly about our thoughts on the second project proposal, the artefact video we had to come up with. All this time I actually hadn’t thought beyond my racing idea, because honestly I was spending more time on George’s assignment, as I was enjoying Body Mechanics a lot. Serra went through my idea and eventually told me that I shouldn’t go ahead with the racing game, because although it involved Unreal Engine, it did not include any animation of any sort, and I agreed with her. So I had to come up with a new idea altogether and had no clue what to do. I remembered that my back-up plan was to make a funny short film, and then it struck me that I wanted to do something related to LEGO. I also wanted to do something with stop motion, because I had wanted to try it out in the collab unit back in Term 2 but did not get the chance to.

I asked Serra if it was a good idea to use these techniques for a short film and she said I could go ahead. Now I had a more concrete plan of action. I wanted to do something related to LEGO and stop motion, and the best idea was to combine the two. I also wanted to explore both physical stop motion and 3D stop motion, so I told Serra the idea I had in mind. My inspiration was mainly The LEGO Movie, which I had watched when I was younger.

I wanted to create something related to Iron Man, where he is being played with as a toy (this would be physical stop motion) and then he has a dream, and that part would be animated in 3D. She really liked it because I was merging the real world and 3D.

Experimenting with LEGO

After going home that day, I spent hours experimenting and playing around with the LEGO toys I had, so that I could come up with an idea.

Week 15

We got introduced to TouchDesigner again (we had a workshop on the same software earlier with a cool guy called Pierre, who walked us through the basics), and this time it was with Sunny. She showed us the basics again and a few cool visuals, and basically took us through the whole software. I understood that it is an entirely node-based system (similar to what I have used in Blender with shaders) and a visual programming tool used in the creative industry. For me, the cool part was creating audio-reactive visuals and using projection mapping to bring these visuals to life in a physical space. I also got introduced to the term “VJing”, which describes the live visuals that popular DJ artists use while performing their tracks on stage. I was always mesmerized looking at the screen, because the music always synced with the visuals, and the way they changed with the beat was nice to see.

Although the session was fun and I liked using TouchDesigner, it didn’t interest me that much, and I wanted to focus on my artefact video because we were already five weeks in and I was lagging behind. I started to research how I would go about animating LEGO in 3D and I found a website called Mecabricks. I played around with its workshop area to see what I could do.

Mecabricks Website

It was literally like an online 3D tool solely for LEGO, and this made my job so much easier! I individually created the characters I needed, attaching the heads, bodies, legs and arms together and choosing the textures I needed for each of them.

Character for the film

Then I started building the environment in the same workshop area. I wanted the dream to have a fight sequence in it, where Iron Man showed off his powers a bit. I took inspiration from the first Iron Man film for this, where Iron Man saves a bunch of civilians from terrorists in Afghanistan. So, I needed to construct a sort of abandoned village kind of area.

Reference and my Environment

Once I had all these assets, I imported them into Blender to start working on the whole thing. I found two really cool add-ons called MecaFace and EpicRigFig, which I could use to create the facial rig and the body rig respectively.

EpicRigFig and MecaFace on the character

It was a bit time-consuming to manually apply both the body rig and the facial rig to all the characters I had, and then I had to append and place them individually in my scene, which again took a lot of time because it kept glitching. But finally I was able to place everything, and all I needed to do was start animating!
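Because appending the characters one by one kept glitching, I also looked into scripting it. Below is a rough Blender Python sketch of how a rigged character collection can be appended from another .blend file; the file path and collection name are just placeholders for my actual files.

```python
import bpy

# Placeholder path to the .blend file holding a rigged character
char_blend = "/path/to/ironman_rigged.blend"

# Append the character's collection into the current scene
bpy.ops.wm.append(
    directory=char_blend + "/Collection/",
    filename="IronMan",  # placeholder collection name
)

# Nudge the appended objects into position for the shot
for obj in bpy.data.collections["IronMan"].objects:
    obj.location.x += 2.0
```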

We also had our Project 2 proposal presentation in class in front of our course leader, and I created a PPT with slides showcasing how I could potentially take the idea forward. Friendred then showed me a reference for a similar-style LEGO video that a senior of mine had created, and it was great to see that someone else had also done something related to LEGO!

Week 16

I missed Serra’s class because I wasn’t feeling well. But I heard from classmates that Serra covered how to import mocap data into Unreal and use the file exactly how you want. She also talked a bit about retargeting and how to properly import an FBX mocap file into UE5. I knew this was important and I regretted missing the class, but I made a lot of progress on my artefact submission. This week I wanted to start on the physical stop motion, so I booked a Stop Motion Bay at the university.

Stop Motion Bay WIP

I set up my LEGO in the scene and it was super fun to play around with the lighting and camera settings! It was really cool to see these tiny LEGO characters come to life on screen. I asked a technician for help on how to go about animating properly, and that’s when I got introduced to Dragonframe, a piece of software that can be used for anything physical-stop-motion related. It was pretty straightforward to use and it was a breeze to shoot my shot, although it was time-consuming and I had to do a lot of retakes, because either my hand would move too fast or one of the LEGO pieces would slip and fall. This was the first time I had ever attempted stop motion! But finally I got the hang of it, and after hours of work I had a few seconds of animation to show for it.

Stop Motion Setup

Because I was short on time, I just tried to do whatever I had in my head for the shot, and a friend who knew a bit of stop motion helped me out, because I realized it was pretty hard to do alone: I had to keep clicking the camera to take a shot and move the LEGO pieces at the same time.

My story was basically to show Iron Man being chosen as a toy to be played with. Then the human takes him into a mini fight sequence where he has to defeat the villain. And then finally the human puts Iron Man to bed and he goes to sleep. After this I needed to get back into 3D and start animating Iron Man where he wakes up from the dream.

Stop Motion Output

After I finished animating and looked at the output, I was thrilled to see that it came out so well! I never expected it to look that way, considering this was the first time I had tried stop motion, and with something as intricate as LEGO at that! I was able to export MP4 files from the system, and now all I had to do was merge this with the 3D animation, which I was yet to start on.

Week 17

Serra mainly advised all of us on our projects this week rather than teaching anything new. When it came to my project, I showed her the progress I had made and she was happy with my idea and what I had done.

Iron Man flying up
Iron Man’s POV Shot
Iron Man’s Mid-Air Shot

But she had a bit of feedback regarding the environment and a few of the animations. Basically, I had animated LEGO Iron Man flying in the air, and the camera was a POV shot from his eyes as he flies up into the sky. While he flies up, the city beneath him gets smaller and there is endless sky above. Serra’s feedback was to add more to the city background, mainly trees and landscape. As for the sky, she told me it was a bit too plain and that I could make it better by adding LEGO clouds and a horizon line of the land below. She also told me to slightly improve the animation of Iron Man flying into the screen, as he was looking a bit too rigid, but I explained to her that it is a LEGO character after all and there is only so much I can do to push the rig to its limits. I will incorporate her feedback going into next week though.

Week 18

Serra cancelled the class this week, but we have an extra class scheduled for next week. I made progress on my artefact video, as I started creating separate files for each scene: the flying-up scene where he wakes up from the dream, and the fight scene where Iron Man uses his powers. In each scene I set up separate camera angles. Again, I used the Mecabricks website to help with my scene setup, especially the environments and characters. I used the platform to create Iron Man, a few civilians and a few enemies. I wanted to recreate a scene from the movie Iron Man, where the main character saves civilians from terrorists, like I mentioned earlier. So I found the scene on YouTube and used it as my reference for the dream sequence.

Fight Scene WIP

I started to animate the civilians and enemies and Iron Man’s entry too. I wanted to make it look epic, just like how it looked in the first film. I was basically trying to recreate the movie scene, but the way that I would picture it and with LEGO of course.

Fight Scene Shot

For my physical stop motion shot, I booked the stop motion room in college to record the last scene of my film where Iron Man gets up from bed and he’s being picked up by a human. I also needed to animate the human placing him on the bed first.

Final Stop Motion Shot

Week 19

Although I missed Serra’s class this week, I got to know that she introduced the class to “VJing” through a piece of software called Resolume. She brought a DJ setup kit to class and it was pretty cool to see it in person. I regret missing this class; however, Emily and Negin caught me up on what happened. I continued to make progress with my artefact video though. The more I got into it, the more invested I became in making it a really good film, because it was genuinely exciting to watch it all come together.

Iron Man’s landing Shot
Enemies Shot
Fight Sequence Shot
Fight Sequence WIP

I was trying to animate Iron Man as best as I could, and I spent a lot of time on this, especially the landing and the fight sequence, because I wanted the audience to feel how fast-paced the scene was. I also tried to play around with special effects using LEGO bricks. I wanted to showcase Iron Man’s powers by animating his arc reactor beam using transparent LEGO parts, then scaling them down or hiding them from the render to match the timing of the shot.
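The show-and-hide part of the beam effect boils down to keyframing the visibility of the transparent pieces. Here is a small sketch of the idea in Blender Python; the object name and frame numbers are placeholders, not my actual shot values.

```python
import bpy

beam = bpy.data.objects["ArcReactorBeam"]  # placeholder name for the transparent beam piece

def show_between(obj, start, end):
    """Keyframe the object so it is only visible between the two frames."""
    for frame, hidden in ((start - 1, True), (start, False), (end, False), (end + 1, True)):
        obj.hide_render = hidden
        obj.hide_viewport = hidden
        obj.keyframe_insert(data_path="hide_render", frame=frame)
        obj.keyframe_insert(data_path="hide_viewport", frame=frame)

# Flash the beam for roughly half a second at 12 fps (placeholder timing)
show_between(beam, start=40, end=46)
```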

In the extra class it was just me and Emily who attended; no one else showed up. Because Emily was struggling with her mocap setup in Unreal, Serra was helping her most of the time. She did see my progress, though, and was impressed with the effects I had created for Iron Man’s fire and also with the environment I had built. She had a few more suggestions: the HDRI could be more animated and cartoony in style instead of realistic, and for Iron Man’s inside-helmet shot she suggested I could try to recreate JARVIS’s HUD inside his helmet. I also needed to work on the lighting for both the intro flying scene and the inside-helmet shot.

Iron Man flying up
Iron Man’s POV Shot
Mid-Air Shot

A new thing I added this week was a shot from inside Iron Man’s helmet, because why not! I tried to get a sense of how it would look inside his helmet, and what my take on it would be for a LEGO figure.

Inside Helmet Shot
Rendered View

I needed to play around with the lighting to achieve a better effect and, like Serra mentioned, try to add a HUD effect inside Iron Man’s helmet like in the movies, a sort of translucent screen. I needed to research how to go about doing this.

Week 20

As this was the last week before our submission deadline, I tried to do as much as I could to progress with my short film idea and hopefully get a decent output. I started to do a few trial renders to see how the lighting and environment looked.

Rendered Shot of Fight Sequence
Rendered Shot of Enemies and Civilians
Rendered Shot of Iron Man’s landing

Although I was happy with the positioning and animation of the characters in the scene, I was not too happy with the lighting and environment, as they were not matching the abandoned-village, war-zone sort of look I was going for. I needed the sky to have more of an evening, sunset kind of lighting, and I needed the environment to look a bit more war-torn. My laptop was not able to handle the files once I started adding more polygons to my scene, so I had to visit the Digital Space at my university to help with the rendering.

Rendering WIP

I asked one of the technicians there and they told me I could leave it rendering overnight. I made a few tweaks to both my scenes and I hope the outputs turn out the way I want them to! One major thing I needed to do now was to get proper sound effects, audio and music to finalize my scenes. So, while the rendering was going on at uni, I started to research how I would go about recording the dialogue, adding the sound effects and choosing the background music.
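One of the tweaks was pushing the fight scene towards that sunset look I mentioned above. Here is a rough Blender Python sketch of the kind of setup I was aiming for; the lamp name and all the colour and angle values are placeholders I was still experimenting with.

```python
import bpy
import math

sun = bpy.data.objects["Sun"]  # placeholder name of the scene's sun lamp
sun.rotation_euler = (math.radians(80), 0.0, math.radians(135))  # low angle for long shadows
sun.data.energy = 3.0
sun.data.color = (1.0, 0.55, 0.3)  # warm orange tint

# Tint the world background towards dusk as well
world = bpy.context.scene.world
world.use_nodes = True
background = world.node_tree.nodes["Background"]
background.inputs["Color"].default_value = (0.25, 0.12, 0.08, 1.0)
background.inputs["Strength"].default_value = 0.4
```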

Final Thoughts and Reflections

For this term: I felt that the majority of my time this term went into my artefact submission, but apart from that, I learnt a lot of pretty cool stuff as well! For starters, I got introduced to the whole world of motion capture, something I had only seen in behind-the-scenes footage after a movie’s post-credits scenes or online. So it was nice to see how the suit worked and how the cameras and sensors functioned. TouchDesigner was also a core part of this term, and I believe I am now fairly equipped to create a decent visual; it was nice to learn a new piece of software. Serra’s classes were fun too, especially the Live Link VCAM session, and it was fun to walk around the class and look at the world I was walking through on the big screen. Although I missed a few classes, I always caught up with my classmates the next day to find out what I had missed. Once this term is over I would like to try out the Vicon suit myself and go through the entire process of exporting the data into Unreal Engine, because I have seen my classmates do it for their artefact submissions and I think they’d be able to help me out. I also want to start a small personal project in TouchDesigner, if I have the time, to create audio-reactive visuals for some of my favorite music.

For the Artefact: I had a pretty tricky start to my artefact submission, as it changed midway through the term from a racing game idea to something entirely different, which turned out to be LEGO stop motion. But I’m really glad this happened, and it could not have happened had it not been for Serra. I was super happy that I could finally work with stop motion, and experiment with LEGO on top of that, something I used to play with as a kid. Bringing these LEGO toys to life brought tears of joy to my eyes, because I never thought I would do something like this. From ideating to bringing the idea to life, both in 3D and in the real world, it was an experience I will never forget. I got to use the Stop Motion Bay at LCC, which was pretty cool, and I got to learn about the Mecabricks website for the 3D side of things. I put a lot of research into how one should go about animating LEGO characters and watched a lot of YouTube tutorials to help me out. And after animating, it had to look good as well, so I went over a bunch of render settings and camera angles to get the right timing and spacing for each shot, because I was trying to animate at 12 frames per second, stop-motion style, instead of the usual 25 or 30 FPS. It was a treat to create the artefact video and I had a lot of fun along the way. I am inspired to create my own LEGO films now, and who knows, maybe start my own channel! I know there are a lot of content creators out there, but I will try to set my work apart by creating good quality stuff. I am a huge fan of the Marvel Cinematic Universe, so one way to go about it would be to recreate famous movie scenes and animate them either in the real world or in 3D.

Here is a link to my artefact video:

https://drive.google.com/file/d/1unwUSZXVrhTejfw_vzs0B0ukgKKRuqpV/view?usp=sharing

Here is a link to my showreel video for the term:

https://drive.google.com/file/d/17JGL8SHRQNiV5OEzrmjlP3OO2APsbYKC/view?usp=sharing

Also here is my FMP proposal if required:

https://docs.google.com/presentation/d/11Z9ptKvlHmmulsZ3s9NK-GldgmpO5Y4N/edit?usp=drive_link