Articles

Unlocking the Power of Software Testing in ASP.NET: Enhancing Software Quality: xUnit & Coverlet — I

Software tests are crucial to guarantee the delivery of a good-quality product and to identify bugs. They also offer comfort during the maintenance phase, code refactoring, or team changes, among the other benefits listed in part 1. As mentioned in part 1, we will create a project with its corresponding test project using xUnit and Coverlet. To bring the example to life, we will create a folder for the projects. Within the folder, we will have a class library and the test project.

Code to be tested

In the class library, we will have a class containing a method that takes a string as input and validates whether it is a number with no more than 10 characters. The commands to create the class library project, and the code itself, are sketched after this section.

Test with xUnit

Next, we have the test project that uses xUnit, which already includes the Coverlet library upon creation. To confirm this, after creating the test project, open the file xUnitCoverletTest.csproj and you will find coverlet.collector among the PackageReference entries. The test class validates the result of the method execution. In our example, if the method receives a string containing only numbers and no more than 10 characters, it should return true; otherwise, the result is false. Two approaches were used for testing: one that allows passing multiple inputs as parameters ([Theory]) and another without parameters ([Fact]). The test project must reference our class library to be able to test its methods. The command sketched below adds that reference; to confirm it worked, open the file xUnitCoverletTest.csproj and check that a ProjectReference tag is present.
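The command listings and code from the original article are not reproduced in this summary, so the following is a minimal sketch of what they might look like. Only the test project name xUnitCoverletTest comes from the text; the folder, library, class, and method names (CoverletDemo, StringValidator.Lib, NumberValidator, IsValidNumber) are assumptions.

```bash
# Hypothetical setup: create a folder, a class library, and an xUnit test project
# (the xunit template already includes the coverlet.collector package).
mkdir CoverletDemo && cd CoverletDemo
dotnet new classlib -o StringValidator.Lib
dotnet new xunit -o xUnitCoverletTest

# Reference the class library from the test project
# (this adds a ProjectReference tag to xUnitCoverletTest.csproj).
dotnet add xUnitCoverletTest/xUnitCoverletTest.csproj reference StringValidator.Lib/StringValidator.Lib.csproj
```

A minimal version of the method under test might look like this:

```csharp
using System.Linq;

namespace StringValidator.Lib;

public static class NumberValidator
{
    // True when the input contains only digits and is at most 10 characters long.
    public static bool IsValidNumber(string? input)
    {
        if (string.IsNullOrEmpty(input) || input.Length > 10)
            return false;

        return input.All(char.IsDigit);
    }
}
```

And the test class, using both approaches mentioned above ([Fact] without parameters, [Theory] with multiple inputs):

```csharp
using StringValidator.Lib;
using Xunit;

namespace xUnitCoverletTest;

public class NumberValidatorTests
{
    [Fact]
    public void IsValidNumber_WithDigitsOnly_ReturnsTrue()
    {
        Assert.True(NumberValidator.IsValidNumber("1234567890"));
    }

    [Theory]
    [InlineData("123", true)]           // digits only, within the limit
    [InlineData("12345678901", false)]  // 11 characters: too long
    [InlineData("12a4", false)]         // contains a non-digit
    [InlineData("", false)]             // empty string
    public void IsValidNumber_ReturnsExpectedResult(string input, bool expected)
    {
        Assert.Equal(expected, NumberValidator.IsValidNumber(input));
    }
}
```

With coverlet.collector in place, running `dotnet test --collect:"XPlat Code Coverage"` executes the tests and collects a coverage report.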

Entity Framework Core

Microsoft has been making efforts to position ASP.NET Core as an excellent choice for software development. Several indicators demonstrate notable progress, including benchmarks available on their website, according to which ASP.NET Core processes a higher number of requests per second than NodeJS and Java.

Entity Framework Core is an open-source Object-Relational Mapper (O/RM) that receives contributions from both Microsoft and the community. It is a cross-platform version of the well-known Entity Framework. It has evolved at a faster pace than its predecessor, introducing several new features not available in the previous version and providing enhanced functionality and improved performance.

Entity Framework Core

Entity Framework Core (EF Core) is a library developed and maintained by Microsoft. Unlike Entity Framework, EF Core is open source, which enables other people and organizations to contribute to the project and has resulted in faster evolution than its predecessor. The involvement of the community has played a crucial role in introducing new features, improvements, and bug fixes. EF Core provides a straightforward and clean approach for ASP.NET Core applications to access and store data in databases. It is designed as a lightweight, open-source, cross-platform, and extensible version of Entity Framework. An object-relational mapper (ORM) reduces the need for manual SQL queries and database management tasks, making the data access layer more streamlined and developer-friendly.

EF Core includes several additional features, such as:
DbContext pooling — allows the reuse of context instances. Each context instance is configured with several internal services and objects required to perform its tasks, and reusing them can result in performance gains in high-performance scenarios.
Alternate keys — allow the definition of columns with unique values, in addition to primary keys.
Global query filters — filters applied globally that restrict the data returned by a query based on a condition. This feature simplifies the application code and reduces the risk of incorrect data access.

Entity Framework Core and database management systems

EF Core supports various database management systems, such as Oracle, SQL Server/Azure SQL Database, SQLite, MySQL, PostgreSQL, Azure Cosmos DB, and In-memory (for testing).

Important concepts in applications with EF Core

Applications that use EF Core employ certain concepts, such as:
Database Context — a class that maps database objects to application objects and holds configurations and data initialization. The Database Context acts as a bridge between the application and the database to retrieve and persist data.
Model — the class that is mapped to a table for retrieving or persisting data. The model is a C# class with additional configurations. There are two approaches to configure the models (a sketch of both follows below):
1. Data Annotations — using attributes directly in the model class to define mappings, relationships, and constraints.
2. Fluent API configurations — using the fluent API methods to configure mappings, relationships, and constraints, typically in the context or a separate configuration class.
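As a hedged illustration of these concepts, the sketch below shows a model configured with data annotations alongside fluent API configuration in the context, touching the alternate-key and global-query-filter features listed above. The type and table names (Product, AppDbContext) are hypothetical, and the SQLite provider (from the Microsoft.EntityFrameworkCore.Sqlite package) is an assumed choice.

```csharp
using System.ComponentModel.DataAnnotations;
using System.ComponentModel.DataAnnotations.Schema;
using Microsoft.EntityFrameworkCore;

// Approach 1: Data Annotations — mapping configured via attributes on the model.
[Table("Products")]
public class Product
{
    [Key]
    public int Id { get; set; }

    [Required]
    [MaxLength(100)]
    public string Name { get; set; } = string.Empty;
}

// The Database Context acts as the bridge between the application and the database.
public class AppDbContext : DbContext
{
    public DbSet<Product> Products => Set<Product>();

    protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
        => optionsBuilder.UseSqlite("Data Source=app.db"); // assumed provider

    // Approach 2: Fluent API — the same kind of configuration, expressed in code.
    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<Product>()
            .HasAlternateKey(p => p.Name);       // alternate key: unique non-PK column

        modelBuilder.Entity<Product>()
            .HasQueryFilter(p => p.Name != "");  // global query filter on every query
    }
}
```

DbContext pooling, the remaining feature listed above, is typically enabled in ASP.NET Core by registering the context with AddDbContextPool<AppDbContext>(...) instead of AddDbContext.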

Clean Hexagonal Architecture: Practical Case

This article is an extension of the theoretical article Components of Enterprise Software, which discusses the combination of software design concepts, specifically Hexagonal Architecture and Clean Architecture. It is necessary to read the previous article to understand the concepts applied here and the intended goals. This time, we will focus primarily on practical issues, namely the implementation of the design presented earlier.

Design Proposal

The following diagram is the result of combining the teachings of the Hexagonal and Clean architectures, and it provides a detailed representation of the intended design for our software. It gives a general, simple, and direct identification of all the layers in the design and their responsibilities:
Entities (or domain) — contains the core business rules.
Use cases (or application) — contains orchestration rules, the main operations of the application.
Interface adapters — mediate the application's interaction with the external world (the infrastructure).
Infrastructure (frameworks, drivers) — the external world, which can include applications, services, libraries, APIs, etc.
Entities and use cases constitute the core of the application, the central component of the architecture (the hexagon), and are responsible for encapsulating all the business logic.

The Hexagonal Perspective

From the perspective of Hexagonal Architecture, the application should be able to:
Support multiple technologies (databases, message brokers, email services, etc.).
Be testable without having all the technologies defined.
Be developed without having all the technologies defined.
A minimal code sketch of this layering follows below.
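The article's diagram could not be reproduced here. As a rough, hedged illustration of the dependency rule it describes (all type names are hypothetical and not from the original), the following C# sketch shows a use case in the core depending only on a port, with a concrete technology supplied as an adapter from the outside:

```csharp
using System;
using System.Collections.Generic;

// Core (the hexagon): entities and use cases depend only on abstractions.
public record Order(Guid Id, decimal Total);           // entity (domain)

public interface IOrderRepository                      // output port
{
    void Save(Order order);
}

public class PlaceOrderUseCase                         // use case (application)
{
    private readonly IOrderRepository _orders;
    public PlaceOrderUseCase(IOrderRepository orders) => _orders = orders;

    public Order Execute(decimal total)
    {
        var order = new Order(Guid.NewGuid(), total);  // core business rule
        _orders.Save(order);                           // via the port, not a concrete DB
        return order;
    }
}

// Interface adapter (infrastructure side): a concrete technology plugs into the port.
public class InMemoryOrderRepository : IOrderRepository
{
    private readonly List<Order> _store = new();
    public void Save(Order order) => _store.Add(order);
}
```

Because the core depends only on IOrderRepository, it can be developed and tested with an in-memory adapter before any real database technology is chosen, which is exactly the hexagonal property described above.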

Monolithic Modular Architecture: Modular Folder Organization

When you’re developing an enterprise application that will only be used by the company and its employees, you’re not building a global-scale application, and you may not have a case for scalability. You don’t need microservices and all the infrastructure complexities that come with them. You still need good foundations in place for your application, and if the time comes, you can break the pieces into separate services easily. This article aims to expand upon the proposal presented in “Components of Enterprise Software: Practical Case” concerning folder organization based on the concepts of Clean Architecture and Hexagonal Architecture. Here, I will propose a more elaborate folder structure than the one presented in the previous article. It should be noted that the Clean and Hexagonal architectures do not pertain to folder organization, but rather to a logical abstraction of software components, their responsibilities, and their dependency relationships.

Key Aspects to Consider

The folder organization should align with software design concepts, with a crucial emphasis on long-term application maintenance and the ability to evolve or scale with minimal effort. An illustrative structure follows the list below.
Separation of Concerns: The folder structure should reflect the principle of separation of concerns, grouping different components or layers of the system based on their responsibilities.
Layered Architecture: Organize based on different layers, such as the presentation layer, business layer, and data layer.
Functional Modules: Identify the functional modules of the application and organize folders accordingly. Each module should have its own folder with its components, such as controllers, services, models, and views.
Code Reusability: The folder structure should promote code reusability. Identify common or utility components that can be shared across different parts of the application and place them in separate folders or modules.
Scalability and Growth: Consider scalability and future project growth. The folder structure should be capable of accommodating new features, components, or layers without becoming excessively complex or confusing. It should facilitate the addition or modification of functionalities as the project evolves.
Naming Conventions: The structure should follow clear and consistent naming conventions for folders and files, making code navigation and understanding the purpose of each folder or file easier.
Configuration and Resources: Separate configuration files, static resources (such as images or style files), and other files unrelated to the source code into appropriate folders.
Collaboration and Team Structure: Consider the development team’s structure and how they collaborate on the project. The folder structure should align with the team’s workflow and facilitate concurrent development, version control, and code review processes.
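The structure proposed in the original article is not reproduced in this summary. As one hedged illustration of a modular, layer-aware organization for an ASP.NET Core solution (all folder and module names here are hypothetical), it might look like:

```
src/
  Modules/
    Orders/                  # functional module
      Domain/                # entities, core business rules
      Application/           # use cases, orchestration
      Infrastructure/        # repositories, adapters to external technologies
      Presentation/          # controllers, views
    Customers/
      Domain/
      Application/
      Infrastructure/
      Presentation/
  SharedKernel/              # reusable components shared across modules
  Configuration/             # settings and resources unrelated to source code
tests/
  Orders.Tests/
  Customers.Tests/
```

Because each module keeps its own layered internals, a module can later be extracted into a separate service with minimal effort, which is the evolution path the article describes.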

Reduce bugs, vulnerabilities, and code smells, among other issues in .NET Core apps, with this tool.

In line with the previously addressed topics, software quality is a concern that companies and professionals in the field must take into account. Previously, we discussed testing, which is an excellent ally in the pursuit of quality. However, there are other aspects we intend to emphasize in this article, such as code security and code complexity, which affects both readability and the ease of incorporating new changes. You might be thinking that code review can mitigate these impacts. The proposal is not to rely solely on developers, but to use a tool that analyzes the code and identifies issues related to the points mentioned above within minutes and, as a bonus, provides suggestions for correction.

SonarQube

SonarQube is a code analysis tool that assists developers in delivering quality, secure, consistent, reliable, and low-complexity code. It supports more than 30 programming languages and can be integrated into the continuous integration pipeline. It was developed in the Java language in 2006 and, in terms of licensing, has a free version as well as paid editions. SonarQube allows you to identify:
Security vulnerabilities;
Bugs;
Code duplication;
Test coverage level;
Code complexity that could hinder the maintenance process.
It also allows creating new rules to ensure that certain standards are followed.

Functioning

The solution consists of two applications: a client that gathers data from the source code and its respective tests, and a server that processes the collected data and presents reports and correction suggestions. On the server, configurations can be made and new rules can be created.

Demonstration

For demonstration purposes, the proposal is to install the Community Edition. The server will be installed in a Docker container, along with a PostgreSQL database. To facilitate this, a Docker Compose file (docker-compose.yml) is provided, which allows configuring and launching multiple containers simultaneously. The file must be created with the name docker-compose.yml. The suggestion is to create a folder, create the file within it, and execute the commands sketched below.
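The compose file and commands from the original article are not reproduced here; the following is a minimal sketch modeled on the official SonarQube Community Edition image. Image tags, port mappings, and the sonar/sonar credentials are assumptions, and persistence volumes are omitted for brevity.

```yaml
# docker-compose.yml — minimal SonarQube Community Edition + PostgreSQL sketch.
services:
  sonarqube:
    image: sonarqube:community
    depends_on:
      - db
    environment:
      SONAR_JDBC_URL: jdbc:postgresql://db:5432/sonar
      SONAR_JDBC_USERNAME: sonar
      SONAR_JDBC_PASSWORD: sonar
    ports:
      - "9000:9000"   # SonarQube web UI
  db:
    image: postgres:13
    environment:
      POSTGRES_USER: sonar
      POSTGRES_PASSWORD: sonar
      POSTGRES_DB: sonar
```

From the folder containing the file, launch both containers in the background and then browse to http://localhost:9000:

```bash
docker compose up -d   # on older installs: docker-compose up -d
```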