Dachser

Dachser is one of the leading logistics and freight transport operators worldwide. The company offers its clients a wide range of logistics solutions through its own global transport network, including warehousing services and integrated IT solutions tailored to each business.

Team

The team is distributed across Madrid, Spain, and Kempten and Berlin, Germany. The plan is to eventually integrate colleagues from the new office in Portugal into the team.

Approach

The approach has mostly followed agile methodologies. However, some applications needed quick, efficient solutions, so a more traditional (waterfall) management style was used for them to ensure a fast and effective response to the project's immediate needs.

Delivery

Management of several projects created from scratch: setting up the repository in Bitbucket, configuring the IntelliJ IDEA IDE, and completing each project's tasks in two-week sprints.

Testing included unit tests, integration tests, and end-to-end tests with Cypress.

Position

Software Developer

Website

Technologies
  • Java 17
  • Spring Boot 3
  • JMS
  • Angular 16
  • Azure
  • Bitbucket
  • SonarQube
  • Jenkins
  • Jira
  • Confluence

Projects

Contingency Azure Project

Implementation of an automated backup solution using Azure Storage Account to manage blob backups. An Azure Function was developed to act as a trigger, executing automatically each time a new blob file is uploaded to the root container. This function was designed to organize the blobs into specific folders, creating new folders when necessary and replacing old files if a blob already existed in the corresponding folder.

Additionally, the function incorporated a mechanism to verify the maximum number of blobs allowed in each folder, with this limit stored in a table within the Storage Account. Each folder was associated with a branch, and the function checked whether the number of blobs in the folder exceeded the allowed limit. If no specific limit was found for the branch, a default value was applied, ensuring efficient and organized management of backup files.
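
A minimal sketch of how such a blob-triggered function could look with the Azure Functions Java runtime and the Azure Storage SDKs. The container, table, and property names ("root", "backups", "BranchLimits", "MaxBlobs") and the branch-from-filename convention are illustrative assumptions, not the actual implementation.

```java
import com.azure.core.util.BinaryData;
import com.azure.data.tables.TableClient;
import com.azure.data.tables.TableClientBuilder;
import com.azure.data.tables.models.TableEntity;
import com.azure.storage.blob.BlobContainerClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.microsoft.azure.functions.ExecutionContext;
import com.microsoft.azure.functions.annotation.BindingName;
import com.microsoft.azure.functions.annotation.BlobTrigger;
import com.microsoft.azure.functions.annotation.FunctionName;

public class BackupBlobFunction {

    private static final long DEFAULT_MAX_BLOBS = 10; // fallback when no limit is stored for a branch

    // Fires each time a new blob lands in the root container (names are illustrative).
    @FunctionName("organizeBackupBlob")
    public void run(
            @BlobTrigger(name = "content", path = "root/{name}",
                         dataType = "binary", connection = "AzureWebJobsStorage") byte[] content,
            @BindingName("name") String name,
            final ExecutionContext context) {

        String connection = System.getenv("AzureWebJobsStorage");
        String branch = branchFor(name); // hypothetical: branch derived from the file name
        BlobContainerClient backups = new BlobServiceClientBuilder()
                .connectionString(connection)
                .buildClient()
                .getBlobContainerClient("backups");

        // Look up the per-branch limit in a Storage Account table, falling back to a default.
        long maxBlobs = DEFAULT_MAX_BLOBS;
        TableClient limits = new TableClientBuilder()
                .connectionString(connection)
                .tableName("BranchLimits")
                .buildClient();
        try {
            TableEntity entity = limits.getEntity("limits", branch);
            maxBlobs = Long.parseLong(String.valueOf(entity.getProperty("MaxBlobs")));
        } catch (Exception e) {
            context.getLogger().info("No specific limit for branch " + branch + ", using default");
        }

        long current = backups.listBlobsByHierarchy(branch + "/").stream().count();
        if (current >= maxBlobs) {
            context.getLogger().warning("Folder " + branch + " exceeds its limit of " + maxBlobs);
        }

        // Copy the blob into its branch folder, replacing any existing file with the same name.
        backups.getBlobClient(branch + "/" + name)
               .upload(BinaryData.fromBytes(content), true);
    }

    private static String branchFor(String blobName) {
        // Hypothetical naming convention: "<branch>_<file>" -> branch prefix
        int idx = blobName.indexOf('_');
        return idx > 0 ? blobName.substring(0, idx) : "default";
    }
}
```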

OPC.UA Project

A server connection developed with Node.js to manage communication in an industrial environment. It allows publishing and subscribing to messages between OPC UA machines, enabling real-time data exchange.

The project included setting up a local OPC UA server and integrating it into Java projects with Maven. Besides the methods for publishing and subscribing to data and for configuring the properties and credentials needed to establish secure communication, custom callbacks were designed to process received messages, improving the monitoring and control of all processes.
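
The internal component itself is not shown here; as a rough illustration of the same subscribe-with-callback pattern, this is how a monitored item with a value-change callback looks with the open-source Eclipse Milo client. The endpoint URL and node ID are placeholders.

```java
import java.util.List;
import org.eclipse.milo.opcua.sdk.client.OpcUaClient;
import org.eclipse.milo.opcua.sdk.client.api.subscriptions.UaSubscription;
import org.eclipse.milo.opcua.stack.core.AttributeId;
import org.eclipse.milo.opcua.stack.core.types.builtin.NodeId;
import org.eclipse.milo.opcua.stack.core.types.builtin.QualifiedName;
import org.eclipse.milo.opcua.stack.core.types.enumerated.MonitoringMode;
import org.eclipse.milo.opcua.stack.core.types.enumerated.TimestampsToReturn;
import org.eclipse.milo.opcua.stack.core.types.structured.MonitoredItemCreateRequest;
import org.eclipse.milo.opcua.stack.core.types.structured.MonitoringParameters;
import org.eclipse.milo.opcua.stack.core.types.structured.ReadValueId;

import static org.eclipse.milo.opcua.stack.core.types.builtin.unsigned.Unsigned.uint;

public class OpcUaSubscriber {

    public static void main(String[] args) throws Exception {
        // Placeholder endpoint of the local OPC UA server.
        OpcUaClient client = OpcUaClient.create("opc.tcp://localhost:4840");
        client.connect().get();

        // One subscription polling the server every second.
        UaSubscription subscription = client.getSubscriptionManager()
                .createSubscription(1000.0).get();

        // Placeholder node exposed by a machine.
        NodeId nodeId = NodeId.parse("ns=2;s=Machine1.Status");
        ReadValueId readValueId =
                new ReadValueId(nodeId, AttributeId.Value.uid(), null, QualifiedName.NULL_VALUE);

        MonitoringParameters parameters = new MonitoringParameters(
                subscription.nextClientHandle(), // client handle
                500.0,                           // sampling interval (ms)
                null,                            // default filter
                uint(10),                        // queue size
                true);                           // discard oldest

        MonitoredItemCreateRequest request =
                new MonitoredItemCreateRequest(readValueId, MonitoringMode.Reporting, parameters);

        // Custom callback processing every received value change.
        UaSubscription.ItemCreationCallback onItemCreated = (item, index) ->
                item.setValueConsumer((monitoredItem, value) ->
                        System.out.println(monitoredItem.getReadValueId().getNodeId()
                                + " -> " + value.getValue()));

        subscription.createMonitoredItems(TimestampsToReturn.Both, List.of(request), onItemCreated).get();
    }
}
```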

Incoming Goods Service Project

Development of a solution to improve transparency and communication of the goods reception process for clients. Automated status updates were implemented and sent to clients via a web service whenever changes occurred in key fields.

The project involved configuring specific fields in a proprietary Java management system that trigger automatic messages to clients and the logistics portal when the status changes during the goods reception process, such as the actual arrival of the truck, the creation of the goods receipt, and the completion of the storage process. This ensured that only relevant information was shared, avoiding unwanted exposure of data.

With this approach, real-time status updates were sent to clients, improving visibility of the goods reception process without exposing SSCC-level information, which most clients did not want shared.
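
The proprietary management system cannot be reproduced here; the sketch below only illustrates the notification step under stated assumptions. The statuses, the endpoint path, and the payload shape are hypothetical, and a plain Spring RestTemplate stands in for the actual web service call.

```java
import java.time.Instant;
import org.springframework.web.client.RestTemplate;

public class GoodsReceiptStatusNotifier {

    // Hypothetical statuses mirroring the key steps described above.
    public enum ReceptionStatus { TRUCK_ARRIVED, GOODS_RECEIPT_CREATED, STORAGE_COMPLETED }

    private final RestTemplate restTemplate = new RestTemplate();
    private final String clientEndpoint; // the client's web service / logistics portal URL (placeholder)

    public GoodsReceiptStatusNotifier(String clientEndpoint) {
        this.clientEndpoint = clientEndpoint;
    }

    // Called whenever one of the monitored fields changes in the management system.
    public void notifyStatusChange(String shipmentId, ReceptionStatus status) {
        // Only the shipment-level status is shared; no SSCC-level detail leaves the system.
        record StatusUpdate(String shipmentId, ReceptionStatus status, Instant timestamp) {}
        restTemplate.postForEntity(
                clientEndpoint + "/incoming-goods/status",
                new StatusUpdate(shipmentId, status, Instant.now()),
                Void.class);
    }
}
```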

SOR Project

This project involved managing JSON messages that are saved to the database only if the received JSON matches the structure and required fields of a template stored in the database.

Two controllers were created: one to manage the templates and another to manage the received JSON data. The templates themselves are also JSON documents.

In this project, the H2 database and Postman were used for testing and verifying that both templates and JSON data were correctly saved in the database. Integration tests were also conducted without Mockito to ensure that queries to the different endpoints (which interacted with the H2 database) were functioning correctly.
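
One way such template matching could be expressed with Jackson is sketched below. The exact template format is not documented here, so this assumes a template whose fields name the required fields of the payload, with nested objects checked recursively.

```java
import java.util.Iterator;
import java.util.Map;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TemplateValidator {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Returns true when every field of the template exists in the payload,
    // recursing into nested objects so the structure matches as well.
    public static boolean matches(String templateJson, String payloadJson) throws Exception {
        return matches(MAPPER.readTree(templateJson), MAPPER.readTree(payloadJson));
    }

    private static boolean matches(JsonNode template, JsonNode payload) {
        Iterator<Map.Entry<String, JsonNode>> fields = template.fields();
        while (fields.hasNext()) {
            Map.Entry<String, JsonNode> field = fields.next();
            JsonNode value = payload.get(field.getKey());
            if (value == null) {
                return false; // required field missing
            }
            if (field.getValue().isObject() && !matches(field.getValue(), value)) {
                return false; // nested structure does not match
            }
        }
        return true;
    }
}
```

A controller receiving a JSON message would then look up the matching template, run this check, and persist the message only when it passes.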

TMS.WMS Project

Dachser Automation Project for automating the high racks of warehouses. It connects with another project called DIP (Data Integration Platform), publishing status updates and subscribing to a topic to receive updates about the high racks' status. Kafka was initially considered, but Spring JMS was ultimately used.

Although the project was initially created to automate the high racks of warehouses, it was designed to be reusable in other projects with different types of information. There is no event queue: if a message is published and no one is listening, the message is discarded and not stored anywhere.
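
A rough sketch of this non-durable topic publish/subscribe with Spring JMS; the destination name and payload type are placeholders, and the actual DIP integration is more involved.

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.stereotype.Component;

@Component
public class HighRackStatusMessaging {

    // The template must be configured with pub/sub enabled so the destination
    // is treated as a topic (e.g. spring.jms.pub-sub-domain=true).
    @Autowired
    private JmsTemplate jmsTemplate;

    // Publish a status update; with a non-durable subscription there is no queue,
    // so the message is lost if nobody is listening at that moment.
    public void publishStatus(String rackStatus) {
        jmsTemplate.convertAndSend("highrack.status", rackStatus);
    }

    // Subscribe to the same topic to receive updates about the high racks.
    @JmsListener(destination = "highrack.status")
    public void onStatus(String rackStatus) {
        System.out.println("High rack status received: " + rackStatus);
    }
}
```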

AHRL.JMS Project

The component was built on a tool called DIP (Data Integration Platform) to manage the publishing and subscribing of messages through the Java Message Service (JMS). It enables sending and receiving messages in distributed systems using topics, metadata, and optional SQL92 selectors, ensuring that only relevant messages are processed by each subscriber.

The project included methods for publishing and subscribing to messages, with options to configure parameters flexibly according to each project's needs. Integration was carried out in Maven environments, where the component was used as a dependency, allowing the necessary properties and credentials to be configured directly within the consuming projects.
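
The SQL92 filters correspond to standard JMS message selectors. Below is a hedged sketch of how a publisher could attach metadata as JMS properties and a subscriber could filter on them with Spring JMS; the topic, property names, and selector expression are illustrative, not the component's actual API.

```java
import jakarta.jms.Message;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jms.annotation.JmsListener;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.stereotype.Component;

@Component
public class FilteredMessaging {

    @Autowired
    private JmsTemplate jmsTemplate; // configured for pub/sub (topics)

    // Publish with metadata attached as JMS properties so subscribers can filter on it.
    public void publish(String topic, String payload, String branch, String messageType) {
        jmsTemplate.convertAndSend(topic, payload, (Message message) -> {
            message.setStringProperty("branch", branch);
            message.setStringProperty("messageType", messageType);
            return message;
        });
    }

    // SQL92-style selector: only messages matching the expression reach this subscriber.
    @JmsListener(destination = "dip.updates",
                 selector = "branch = '001' AND messageType = 'STATUS'")
    public void onFilteredMessage(String payload) {
        System.out.println("Relevant message for this subscriber: " + payload);
    }
}
```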