
[CV] [Homepage] [Transcript]

Self Introduction

I am Jiacheng Luo (罗嘉诚), a junior student majoring in CSE, specializing in Computer Science and Technology, at SUSTech. Currently, my academic advisor is Prof. Jianguo Zhang, and my life advisor is Assistant Prof. Bin Zhu.

  • Prof. Jianguo Zhang leads the CVIP Group laboratory at SUSTech. He previously served as a Reader in the School of Science and Engineering at the University of Dundee, UK, and as the Director of International Cooperation in the Department of Computer Science.
  • Prof. Bin Zhu is an assistant professor and doctoral supervisor in the School of Public Health and Emergency Management (SPHEM) at SUSTech.

The main research areas of the CVIP Group laboratory are computer vision, medical image and information processing, machine learning, and artificial intelligence.

My research interests include Domain Adaptation, Transfer Learning, Parameter-Efficient Fine-Tuning and Large Model Training.

Contact Me

Academic Background

  • Sep. 2021 - Jun. 2025 (expected): Southern University of Science and Technology (B.Eng.)

Research Interests

Domain Adaptation
    Domain adaptation is a field associated with machine learning and transfer learning. This scenario arises when we aim to learn a model from a source data distribution and apply it to a different (but related) target data distribution. For instance, a common spam-filtering task consists of adapting a model from one user (the source distribution) to a new user who receives significantly different emails (the target distribution). Domain adaptation has also been shown to be beneficial for learning from unrelated sources. When more than one source distribution is available, the problem is referred to as multi-source domain adaptation.
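
A minimal sketch of one popular approach, DANN-style adversarial adaptation with a gradient reversal layer (assuming PyTorch; the module sizes and names are placeholders, not a specific paper's architecture):

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; negates and scales the gradient on backward."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # The reversed gradient pushes features toward domain invariance.
        return -ctx.lambd * grad_output, None

feature_extractor = nn.Sequential(nn.Linear(64, 32), nn.ReLU())
label_classifier = nn.Linear(32, 10)   # supervised on labeled source data
domain_classifier = nn.Linear(32, 2)   # source-vs-target discriminator

def forward_pass(x, lambd=1.0):
    feats = feature_extractor(x)
    class_logits = label_classifier(feats)
    domain_logits = domain_classifier(GradReverse.apply(feats, lambd))
    return class_logits, domain_logits
```
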
Transfer Learning
    Transfer learning is a technique in machine learning in which knowledge learned from one task is reused to boost performance on a related task. For example, in image classification, knowledge gained while learning to recognize cars could be applied when trying to recognize trucks. The topic is related to the psychological literature on transfer of learning, although practical ties between the two fields are limited. Reusing information from previously learned tasks on new tasks has the potential to significantly improve learning efficiency.
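
A minimal sketch of the standard recipe, reusing a pretrained image backbone and replacing only its classification head (assuming a recent torchvision; the 10-class head is a placeholder for the new task):

```python
import torch.nn as nn
from torchvision import models

# Load an ImageNet-pretrained backbone and freeze all of its parameters.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False

# Replace the final layer with a new task-specific head;
# only this head receives gradients during fine-tuning.
model.fc = nn.Linear(model.fc.in_features, 10)
```
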
Parameter-Efficient Fine-Tuning
    Parameter-efficient fine-tuning (PEFT) is a technique used in natural language processing (NLP) to improve the performance of pre-trained language models on specific downstream tasks. It reuses the pre-trained model's parameters and fine-tunes them on a smaller dataset, which saves computational resources and time compared to training the entire model from scratch. PEFT achieves this efficiency by freezing most of the pre-trained model's parameters and training only a small, task-specific subset, for example the last few layers, inserted adapter modules, or low-rank weight updates. This way, the model can be adapted to new tasks with less computational overhead and fewer labeled examples. Although PEFT is a relatively recent term, updating only the last layer of a model has been common practice in computer vision since the introduction of transfer learning, and even in NLP, early experiments compared static and non-static word embeddings. PEFT aims to improve the performance of pre-trained models, such as BERT and RoBERTa, on downstream tasks including sentiment analysis, named entity recognition, and question answering, and it does so in low-resource settings with limited data and compute. Because it modifies only a small subset of model parameters, it is also less prone to overfitting.
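
A toy sketch of one PEFT method, a LoRA-style linear layer in which the frozen pretrained weight is augmented with a trainable low-rank update (assuming PyTorch; the rank, scaling, and class name are illustrative assumptions):

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Frozen base linear layer plus a trainable low-rank update B @ A."""
    def __init__(self, in_dim, out_dim, rank=4, alpha=8):
        super().__init__()
        self.base = nn.Linear(in_dim, out_dim)
        self.base.weight.requires_grad = False   # freeze pretrained weights
        self.base.bias.requires_grad = False
        self.A = nn.Parameter(torch.randn(rank, in_dim) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_dim, rank))  # zero init: no change at start
        self.scale = alpha / rank

    def forward(self, x):
        # Base output plus the scaled low-rank correction.
        return self.base(x) + self.scale * (x @ self.A.t() @ self.B.t())
```

Only A and B are trained, so the number of trainable parameters is a small fraction of the full weight matrix.
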
Large Model Training
    Large model training is the process of training machine learning or deep learning models that have a very large number of parameters or complex architectures. It requires substantial computational resources, such as GPUs or TPUs, along with extensive datasets for effective training. Using optimization algorithms like stochastic gradient descent (SGD) or its variants, training iteratively updates the model parameters to optimize performance. Techniques like mini-batch training, regularization, and learning-rate scheduling are often employed to improve convergence and mitigate overfitting. This approach is widely used in natural language processing, computer vision, and reinforcement learning, where intricate data patterns require high-capacity models for effective analysis and prediction.
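
A compact sketch of such a training loop, showing mini-batching, weight decay as regularization, and a learning-rate schedule (assuming PyTorch; the model, data, and hyperparameters are placeholders):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

model = nn.Linear(16, 2)                                  # stand-in model
data = TensorDataset(torch.randn(256, 16), torch.randint(0, 2, (256,)))
loader = DataLoader(data, batch_size=32, shuffle=True)    # mini-batches
opt = torch.optim.SGD(model.parameters(), lr=0.1,
                      momentum=0.9, weight_decay=1e-4)    # weight decay = L2 regularization
sched = torch.optim.lr_scheduler.CosineAnnealingLR(opt, T_max=10)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(10):
    for x, y in loader:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    sched.step()   # decay the learning rate once per epoch
```
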

News and Updates

  • [Feb 02, 2024] One co-authored paper has been submitted to ICML 2024 for consideration.
  • [Aug 31, 2023] My personal academic website is online.
  • [Jul 18, 2023] Honored to join the CVIP Group as a formal member, and I hope to do a good job!
  • [Aug 22, 2022] Happy to join the CVIP Group as an unofficial attending student!

Pinned Repositories

  1. SUSTech_CS205_Cpp_Projects (C++): All five projects of the C++ course at SUSTech.

  2. stackoverflow-web-application (Java, forked from XiaoLeGG/stackoverflow-web-application): A Stack Overflow web application practice project for the CS209A course (Spring) at SUSTech.

  3. WheelRadiusPointCloud (Python): Calculation method, device, medium, and equipment for grinding wheel fillet radius.

  4. SUSTech_CS102A_JavaA_OthellooO (Java).