Back to the classroom, master's degree mode
- March 12, 2026
I had the first class of my master’s degree. It is funny how quickly some sensations come back: the university campus, the different rhythm of the conversations, the professor organizing ideas on the projector, students from completely different fields sharing the same classroom. After three MBAs and the undergraduate degree I started back in 2007, stepping into a classroom again carries a certain nostalgia.
But at the same time the feeling is also very different. When I took my first university classes, the internet was still consolidating itself as basic infrastructure of society. Today the discussion revolves around artificial intelligence, massive data networks and models capable of interpreting human language.
I am doing my master’s at the Graduate Program in Engineering and Knowledge Management at UFSC. The program studies how knowledge is created, structured and shared when people, technology and organizations meet. For me this connects closely with what I already do at work: my daily routine revolves around leading innovation initiatives and the development of digital products. In a way, the master’s became a way to look with more method and depth at themes that are already part of my daily life, like artificial intelligence, information networks and digital transformation.
The journey begins with the course Social Network Analysis and Applied Artificial Intelligence. The idea is to understand how digital methods and artificial intelligence can be used to analyze data and networks in different contexts: not only as tools to produce content, but as methodological instruments to analyze large volumes of information.
Very early in the class an interesting question came up: what is a unit of information? It can be a text, an image, a video, a tweet, a comment, a profile. Basically, any element that can be transformed into data and analyzed computationally.
The goal of the course is not to turn us into programmers or machine learning specialists. The focus is more on understanding the concepts and learning how to use tools that allow collecting data, organizing information and extracting patterns.
At many moments the class touched on graph theory, the mathematical foundation for understanding networks. A graph is a simple structure composed of nodes and edges: nodes represent entities and edges represent the relations between them.
When we think about digital social networks this becomes very intuitive. People are nodes, interactions are connections. Likes, comments, mentions and shares become metrics inside this network.
The interesting part is that this structure lets us see patterns that would stay invisible if we looked at each data point individually. A graph can reveal who the central actors are, which groups are connected to each other and how information circulates inside a system.
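To make the node-and-edge idea concrete, here is a minimal Python sketch using invented interaction data (the profile names and interactions are made up for illustration). It builds a tiny undirected graph and computes degree centrality, the simplest measure of who the central actors are:

```python
from collections import defaultdict

# Hypothetical interaction data: (source, target) pairs, e.g. user mentions.
interactions = [
    ("ana", "ministry"), ("bruno", "ministry"), ("carla", "ministry"),
    ("bruno", "ana"), ("carla", "diego"), ("diego", "ministry"),
]

# Build an undirected graph: nodes are profiles, edges are interactions.
graph = defaultdict(set)
for source, target in interactions:
    graph[source].add(target)
    graph[target].add(source)

# Degree centrality: the fraction of the other nodes each node connects to.
n = len(graph)
centrality = {node: len(neighbors) / (n - 1) for node, neighbors in graph.items()}

most_central = max(centrality, key=centrality.get)
print(most_central)  # "ministry": it is connected to every other profile
```

In real research one would use a dedicated library such as NetworkX, which also offers richer measures like betweenness and eigenvector centrality, but the underlying logic is exactly this.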
Professor Rita showed a very interesting research example done during the pandemic. She analyzed the Twitter communication of the Health Ministries of Canada and Brazil over forty days. From the collected data it was possible to build graphs showing how information spread and which profiles were most influential in that network.
In this kind of visualization something curious appears. A big circle in the center represents the profile with the highest centrality, in this case the Ministry of Health itself. Around it appear smaller nodes representing users interacting with that account, and thicker lines indicate more frequent connections.
Another part of the class discussed digital methods as a set of tools to analyze communication phenomena. This involves everything from data collection to visualization and interpretation. Four important elements were highlighted for any research using these approaches.
First, understanding the context of the phenomenon being studied; without that, any analysis risks interpreting data outside of reality. Then, the available infrastructure: not every platform allows easy access to data, and some require paid APIs or restrict data collection completely.
The third point is a minimum of technical knowledge. It is not necessary to be a programmer, but understanding the basic concepts behind the tools and languages helps a lot. And finally, interpretative logic: data by itself does not say much, and analysis requires critical reading and theoretical grounding.
Professor Alexandre went a bit deeper into how modern artificial intelligence works. He explained how language models transform words into numerical vectors and how this allows comparing semantic contexts.
When we type a question into a model like ChatGPT, it does not “understand” the sentence the way humans do. It transforms that text into mathematical representations and searches for patterns inside a gigantic network of relations learned during training. These vector representations make it possible to measure similarity between contents; it is the same logic used in search engines, recommendation systems and many data analysis tools.
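A toy sketch of that vector logic: cosine similarity measures how closely two vectors point in the same direction, which is how semantically related contexts are compared. The three-dimensional vectors below are invented for illustration; real embeddings have hundreds or thousands of dimensions produced by a trained model.

```python
import math

# Made-up toy "embeddings"; real models would produce these vectors.
cat = [0.9, 0.8, 0.1]
dog = [0.8, 0.9, 0.2]
car = [0.1, 0.2, 0.9]

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(cosine_similarity(cat, dog))  # high: similar semantic contexts
print(cosine_similarity(cat, car))  # low: unrelated contexts
```

The same comparison, scaled to millions of documents with approximate nearest-neighbor search, is what powers semantic search and recommendation systems.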
While I was in class I kept thinking about how this kind of approach connects directly with what I do at work. In the world of technology and digital products we are constantly dealing with data, patterns and human behavior mediated by platforms. It would be great if the corporate world had more time to observe these phenomena with this level of attention.
At the end of the class I had that good feeling of starting a journey: the sense that there is a big territory to explore, especially around communication methodologies and research areas a bit outside of my purely technical work in management and IT.
Communication today happens inside gigantic digital systems: platforms, algorithms, data, networks. Understanding these structures is perhaps one of the most interesting ways to study contemporary society.
Stay hungry, stay foolish.