Witline.Core.Api 1.0.16

Table of Contents


Initialize a new Project From Template

  • Step 1: Clone the template project. [Link](https://gitlab.com/witline/witline-masstransit/witline-base-template-v2)

  • Step 2: Delete the .git folder. It is hidden, so enable viewing of hidden files in your folder settings first.

  • Step 3: Download and install Git Bash for Windows.

  • Step 4: Create or go to the folder where you want to clone the git repository, right-click, and select Git Bash. This allows Git Bash commands to run from the current directory.


  • Step 5: Run the following command in the Git Bash console:
./renameObjects.sh Template YourProjectName (e.g. UiManagement)


  • Step 6: Then manually rename the folder from template to your project name (e.g. uimanagement).

  • Step 7: Remove the .vs folder from inside the src folder.

  • Step 8: Copy your entities into ProjectName.Domain, and common entities into the Common folder.

  • Step 9: All entities, whether main or common, must inherit from BaseEntity.

  • Step 10: Register all your entities in ApplicationContext, inside the #region Project DbSets section.

    • You will find some entities already set up. Those are common/shared entities; remove any that your project does not require.
  • Step 11: Configure user secrets. Add user secrets to both the API and Core.Migration projects.

    • Create a user secrets file by right-clicking the Template.Api project and selecting "Manage User Secrets".
    • Copy the application settings from appsettings.json into the user secrets file, then delete them from appsettings.json.

    Note: Copy the settings from the label "Move below settings to User Secret File" down to "End User Secret", then delete them from appsettings.json.

    /***************************************************************************************************************************/
    /** Move below settings to User Secret File*/
    /***************************************************************************************************************************/
    
    "SecurityGuard": true, //Enable/Disable centeralized authorization. true or null means security guard is enabled.
    "OdataGuard": false, //Enable/Disable centeralized odata filteration. true or null means odata guard is enabled.
    
    "JWT_API_KEY": "Sample Test Key Delete the First Node Delete Operation - Singly Linked List",
    "JWT_ISSUER": "Witline_Server",
    "JWT_AUDIENCE": "Witline_Client",
    
    //-------------------------------------------------
    // Server Public/Private Key
    //-------------------------------------------------
    "ServerPublicKey": "",
    "ServerPrivateKey": "",
    
    
    //-------------------------------------------------
    // Database and other services settings
    //-------------------------------------------------
    "POSTGRES_HOST": "localhost",
    "POSTGRES_PORT": "5432",
    //-------------------------------------------------
    // Local System
    //-------------------------------------------------
    
    //"PG_MICROSERVICE_DB_NAME": "Template",
    //"PG_MICROSERVICE_DB_USER": "postgres",
    //"PG_MICROSERVICE_DB_PASS": "pgadmin",
    
    //"PG_AUDIT_DB_NAME": "auditdb",
    //"PG_AUDIT_DB_USERNAME": "postgres",
    //"PG_AUDIT_DB_PASSWORD": "pgadmin",
    
    //"RABBITMQ_HOST": "localhost",
    //"RABBITMQ_DEFAULT_USER": "guest",
    //"RABBITMQ_DEFAULT_PASS": "guest",
    
    //"MONGODB_HOST": "localhost",
    //"MONGODB_PORT": 27017,
    //"MONGO_APP_USERNAME": "",
    //"MONGO_APP_PASSWORD": "",
    //-------------------------------------------------
    //Developer System
    //-------------------------------------------------
    "PG_MICROSERVICE_DB_NAME": "Template",
    "PG_MICROSERVICE_DB_USER": "postgres",
    "PG_MICROSERVICE_DB_PASS": "", /* Copy settings from existing project*/
    
    "PG_AUDIT_DB_NAME": "auditdb",
    "PG_AUDIT_DB_USERNAME": "postgres",
    "PG_AUDIT_DB_PASSWORD": "", /* Copy settings from existing project*/
    
    "RABBITMQ_HOST": "localhost",
    "RABBITMQ_DEFAULT_USER": "rabbitadmin",
    "RABBITMQ_DEFAULT_PASS": "", /* Copy settings from existing project*/
    
    "MONGODB_HOST": "localhost",
    "MONGODB_PORT": 27017,
    "MONGO_APP_USERNAME": "mongoadmin",
    "MONGO_APP_PASSWORD": "", /* Copy settings from existing project*/
    
    
    "TWILIO_ACCOUNT_ID": "", /* Copy settings from existing project*/
    "TWILIO_AUTH_TOKEN": "", /* Copy settings from existing project*/
    
    "SENDGRID_API_KEY": "", /* Copy settings from existing project*/
    
    "STRIPE_SECRET_KEY": "", /* Copy settings from existing project*/
    "STRIPE_PUBLISHABLEKEY": "", /* Copy settings from existing project*/
    
    "SYSTEM_ADMIN_EMAIL": "vebra_uk@hotmail.com",
    "SYSTEM_ADMIN_USERNAME": "vebra_uk",
    "SYSTEM_ADMIN_PASSWORD": "", /* Copy settings from existing project*/
    "SYSTEM_ADMIN_PHONE_NUMBER": "07476241925",
    
    "GOOGLE_STORAGE_BUCKET_NAME": "test-app-storage.gcp-buckets.witline.co.uk",
    "GOOGLE_STORAGE_SA_KEY_FILE": "", /* Copy settings from existing project*/
    "GOOGLE_APPLICATION_CREDENTIALS": "", /* Copy settings from existing project*/
    "DOTNET_RUNNING_IN_CONTAINER": false
    /***************************************************************************************************************************/
    /* End User Secret */
    /***************************************************************************************************************************/
    
    • Read the settings carefully. Update the username and password based on the shared data, or copy the settings from a previous/existing project.

    You must set ServerPublicKey and ServerPrivateKey; otherwise the project will not run and will throw an exception.

    • Once the settings are copied to the user secrets file, double-click the YourProject.Api project and copy the UserSecretsId. This value must also be copied to the Core.Migration project and to ApplicationContext, as shown in the picture below.
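The same secrets can also be managed from the command line with the dotnet user-secrets tool. The project paths below are illustrative; adjust them to your solution layout:

```shell
# Create/associate a UserSecretsId with the project
dotnet user-secrets init --project src/Template.Api

# Set individual secrets (repeat for each key from the block above)
dotnet user-secrets set "PG_MICROSERVICE_DB_PASS" "yourPassword" --project src/Template.Api

# Verify what is stored
dotnet user-secrets list --project src/Template.Api
```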

By following the above steps, the developer will be able to run the template project successfully.
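Steps 1 to 6 condense to a short Git Bash session. Because this guide does not document the internals of renameObjects.sh, the sketch below simulates the rename on a throwaway folder, assuming the script performs a recursive find-and-replace of the template name; in the real flow you would clone the repository and run the script instead of the sed/mv lines:

```shell
# Real flow:
#   git clone https://gitlab.com/witline/witline-masstransit/witline-base-template-v2 template
#   cd template && rm -rf .git && ./renameObjects.sh Template UiManagement
# Simulated here on a throwaway folder so the steps can be seen end to end:
mkdir -p template/src
printf 'namespace Template.Api { }\n' > template/src/Program.cs
rm -rf template/.git                                                 # step 2: drop the template's git history
find template -type f -exec sed -i 's/Template/UiManagement/g' {} +  # step 5 equivalent: rename inside files
mv template uimanagement                                             # step 6: rename the folder itself
```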

Template Project Structure

  • Unit Test projects in the Test folder
  • Core.Migration
  • Shared
  • Template.Api
  • Template.Business
  • Template.Component
  • Template.Domain


Core.Migration

This project is used for EF Core migrations; it is otherwise empty. The developer must add user secrets for the database name, host, username, password, and port, and update the Program class with the UserSecretsId.

Shared

It is a shared project; functionality used across the other projects must be added here.

API Project

An ASP.NET Web API project. All controller classes should inherit from BaseController. Most services are already registered in the startup class; any new service must be registered there as well. Startup class details are covered later.

Business Project

The project contains the business-related functionality. The following classes should be placed in this project:

  • All database contexts (Entity Framework Core context, MongoDB context)
  • The business repositories (repository pattern)
  • The services layer: all services used by the API project should be added here

Component Project

The project is used for MassTransit (MT) microservices. All MT contracts and services are added to this project.

Domain Project

This project holds our database entities. There may be different types of entities, all of which should inherit from BaseEntity.

Installing Docker images

Download the zip file from here; it contains the docker-compose YAML files for all images.

  1. (Required step) Before pulling RabbitMQ, Postgres, or MongoDB, create a network in Docker with the command below:
docker network create services-network

If you skip this network creation step, pulling the images fails with: Error: network services-network declared as external, but could not be found

Now pull the required docker images by going into each folder and doing the following:

  1. Open terminal in folder where docker-compose.yaml file is located
  2. Enter 'docker-compose up -d'

Docker will download the image, configure it, and start the container automatically.
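For reference, each compose file declares the shared network as external, roughly like this (the image and credentials below are illustrative; use the values from the downloaded zip):

```yaml
version: "3.8"
services:
  postgres:
    image: timescale/timescaledb-postgis:latest-pg12
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: pgadmin
    ports:
      - "5432:5432"
    networks:
      - services-network

networks:
  services-network:
    external: true   # must already exist: docker network create services-network
```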


Note: Ignore this section; it is no longer needed now that the Postgres and other images in the section above have been updated.

Install PostGIS and TimescaleDB extensions for Postgres

  1. Pull the docker image with the command docker pull timescale/timescaledb-postgis:latest-pg12
  2. If Postgres is already installed, stop its container in Docker.
  3. Then run the following command, which starts the pulled image as a single Postgres container:
docker run -d --name postgres -e POSTGRES_PASSWORD=pgadmin -e POSTGRES_USER=postgres -v ${HOME}/postgres-data/:/var/lib/postgresql/data -p 5432:5432 timescale/timescaledb-postgis:latest-pg12

Rules

The following rules should be followed by the developers.

Coding/Patterns/Methodology Style

C# coding styles should be used.

  • Field/private/readonly field names should start with an underscore (optional)
  • Property/function/class names should be in Pascal case
  • Interfaces must be prefixed with I, followed by Pascal case.
  • Add proper comments to public properties/functions
  • Add additional comments to complex logic, explaining what is happening
  • Application settings must be saved in the user secrets file. Our projects rarely use appsettings.json, for security reasons.
  • Base Classes/Interfaces: Our template uses a variety of base classes in different projects. It is important to use them because they provide the basic functionality. The basic classes are:
    • BaseEntity: All of our database entities should be inherited from BaseEntity. It has a primary key along with some important properties.
    • IMainEntity: Documentation..
    • IEntity: Documentation..
    • IBaseDocument: Documentation..

NodaTime

NOTE: NodaTime has been removed from our project template due to compatibility issues with OData.

The developer should not use the vanilla .NET DateTime; it is replaced with the NodaTime date and time types.

We are using IClockService, which provides the basic date/time-related functionality. The developer should inject IClockService into their class constructor, which gives them access to the date/time functions. A glance at the IClockService interface:

public interface IClockService
{    
    DateTimeZone TimeZone { get; }
    Instant Now { get; }
    LocalDateTime LocalNow { get; }
    Instant ToInstant(LocalDateTime local);
    LocalDateTime ToLocal(Instant instant);
}

Use IClockService.Now instead of DateTime.Now. IClockService.Now returns the UTC date and time; for the local date and time use IClockService.LocalNow.
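A minimal usage sketch (the OrderService class is illustrative; IClockService is assumed to be registered with the DI container in the startup class):

```csharp
using NodaTime;

public class OrderService
{
    private readonly IClockService _clockService;

    // IClockService is injected by the DI container configured in the startup class.
    public OrderService(IClockService clockService) => _clockService = clockService;

    public Instant StampOrder()
    {
        // Use the clock service instead of DateTime.Now / DateTime.UtcNow.
        Instant createdAt = _clockService.Now;        // UTC instant
        LocalDateTime local = _clockService.LocalNow; // local wall-clock time
        return createdAt;
    }
}
```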

Why do we use NodaTime? Please read the following blog post.

A better date and time API for .NET: Noda Time is an alternative date and time API for .NET. It helps you think about your data more clearly and express operations on that data more precisely. Everything is already set up in the project; developers should use the NodaTime date and time types. The core NodaTime types are:

  • Instant
  • Offset
  • CalendarSystem
  • LocalDateTime
  • LocalDate
  • LocalTime
  • OffsetDateTime
  • OffsetDate and OffsetTime
  • DateTimeZone
  • ZonedDateTime
  • Duration
  • Period
  • IClock
  • Interval

The detailed documentation can be viewed here

Date and time arithmetic

There are two types of arithmetic in Noda Time: arithmetic on the time line (in some sense "absolute" arithmetic), and calendrical arithmetic. Noda Time deliberately separates the two, with the aim of avoiding subtle bugs where developers may be tempted to mix concepts inappropriately.

Duration duration = Duration.FromMinutes(3);
Instant now = SystemClock.Instance.GetCurrentInstant();
Instant future = now + duration; // Or now.Plus(duration)

ZonedDateTime nowInIsoUtc = now.InUtc();
ZonedDateTime thenInIsoUtc = nowInIsoUtc + duration;

Complete documentation can be followed here

NodaTime and PostgreSQL

The Npgsql provider recommends mapping the NodaTime types to the corresponding PostgreSQL data types. This integration is already done in the project; the NodaTime function mappings are provided by the Npgsql NodaTime provider.

The detailed documentation can be followed here

TimescaleDB

There is a dedicated TimescaleDB repository containing a complete sample of how to use the features of the TimescaleDB extension. Detailed TimescaleDB documentation is available from the following link.

The following shows how TimescaleDB is integrated with the user management project. Later on, this project will become the base template for all of our projects.

The TimescaleDB integration can be found in the following C# classes

TimescaleDB Integration

  • Program.cs
  • ApplicationContext.cs Location: Template.Business > DataAccess
  • Template.Domain
  • The majority of the details are wrapped in the Witline.Common NuGet package.
  • Postgres Function Types

Let's dig into the details.

Program.cs

if (context.Database.GetPendingMigrations().Any())
    context.Database.Migrate();
context.TransformToHypertable();
context.AddUserDefinedFunctions();

The call context.TransformToHypertable() converts entities marked with the Hypertable attribute into TimescaleDB hypertables. The next line, context.AddUserDefinedFunctions(), adds user-defined functions to the PostgreSQL database. There are three types of functions used in our project, detailed later.

ApplicationContext.cs

/* Below is the code showing how to set indexes */
modelBuilder.HasPostgresExtension("TimescaleDB");
//modelBuilder.Entity<Ride>()
//    .HasNoKey()
//    .HasIndex(i => new { i.VendorId, i.PickupTime })
//    .HasSortOrder(Npgsql.EntityFrameworkCore.PostgreSQL.Metadata.SortOrder.Ascending, Npgsql.EntityFrameworkCore.PostgreSQL.Metadata.SortOrder.Descending);
//modelBuilder.Entity<Ride>()
//    .HasIndex(i => new { i.PickupTime, i.VendorId })
//    .HasSortOrder(Npgsql.EntityFrameworkCore.PostgreSQL.Metadata.SortOrder.Descending, SortOrder.Ascending);
//modelBuilder.Entity<Ride>()
//   .HasIndex(i => new { i.RateCode, i.PickupTime })
//   .HasSortOrder(SortOrder.Ascending, SortOrder.Descending);
//modelBuilder.Entity<Ride>()
//  .HasIndex(i => new { i.PassengerCount, i.PickupTime })
//  .HasSortOrder(SortOrder.Ascending, SortOrder.Descending);
modelBuilder.AddPostgresUserDefinedFunctions();

These settings configure the TimescaleDB entities. The TimescaleDB extension must be installed on top of the PostgreSQL database.

The line modelBuilder.HasPostgresExtension("TimescaleDB"); enables the timescale extension on the current database.

Note: The user must install the timescale extension before using it with Entity Framework. The code above enables it and adds the necessary timescale tables.

Next comes the commented code, which is used to add indexes on entities. The last line, modelBuilder.AddPostgresUserDefinedFunctions(), adds the functions that we will use in LINQ queries.

Template.Domain

I have added a sample entity called TimescaleEntity. The entity is decorated with the Hypertable attribute, which automatically converts the regular table into a hypertable.

[Hypertable(nameof(DateOfJoining))]
public class TimescaleEntity:BaseEntity
{
    public DateTime DateOfJoining { get; set; }
    public string Address { get; set; }
}

For learning purposes, I borrowed an entity called Ride from the TimescaleDB sample. Let's see what we need to convert an entity into a hypertable.

[Table("rides")]
[Hypertable("pickup_datetime", "vendor_id", 2)]
public class Ride    
{      
     
    [Column("vendor_id")]
    public string VendorId { get; set; }
    [Column("pickup_datetime", TypeName = "timestamp")]
    public DateTime PickupTime { get; set; }
    [Column("dropoff_datetime", TypeName = "timestamp")]
    public DateTime DropOffTime { get; set; }
    [Column("passenger_count")]
    public decimal? PassengerCount { get; set; }
    [Column("trip_distance")]
    public decimal? TripDistance { get; set; }
    [Column("pickup_longitude")]
    public decimal? PickupLongitude { get; set; }
    [Column("pickup_latitude")]
    public decimal? PickupLatitude { get; set; }
    [Column("rate_code")]
    public int? RateCode { get; set; }
    [Column("dropoff_longitude")]
    public decimal? DropOffLongitude { get; set; }
    [Column("dropoff_latitude")]
    public decimal? DropOffLatitude { get; set; }
    [Column("payment_type")]
    public int? PaymentTypeId { get; set; }        
    [Column("fare_amount")]
    public decimal? FareAmount { get; set; }
    [Column("extra")]
    public decimal? Extra { get; set; }
    [Column("mta_tax")]
    public decimal? MTATax { get; set; }
    [Column("tip_amount")]
    public decimal? TipAmount { get; set; }
    [Column("tolls_amount")]
    public decimal? TollsAmount { get; set; }
    [Column("improvement_surcharge")]
    public decimal? ImprovementSurcharge { get; set; }
    [Column("total_amount")]
    public decimal? TotalAmount { get; set; }
    //[Column("pickup_geom", TypeName = "geometry(Point,2163)")]
    public Point PickupGeom { get; set; }
    //[Column("dropoff_geom", TypeName = "geometry(Point,2163)")]
    public Point DropoffGeom { get; set; }
}

The entity looks scary, but don't worry; we only need one line of code. The magic line is [Hypertable("pickup_datetime", "vendor_id", 2)]: the attribute converts the entity into a hypertable.

The Ride entity's Hypertable attribute uses two column parameters. The time column name is required; in our case, pickup_datetime is the column containing the time values and the primary column to partition by. The following parameter is optional and specifies an additional column to partition by.

There are more parameters the user can set on the Hypertable attribute. Detailed documentation can be viewed directly in the code, in the official documentation, or in the source code.

How to use HyperTable

Everything is integrated into the project. The developer needs two things. First, mark entities with the Hypertable attribute. For example:

[Hypertable(nameof(OrderDate))]
public class TestEntity
{
    public Guid Id { get; set; }
    public DateTime OrderDate { get; set; }
}

Second, if a developer needs a function that doesn't exist, they must translate that function.

Postgres Function Types

The Function Types documentation below is written for PostgreSQL, since our project uses PostgreSQL as the primary database, but the steps apply to any database.

There are three types of functions that we use with Entity Framework:

  1. PostgreSQL functions that are translated into Entity framework
  2. PostgreSQL functions that are not translated/mapped into Entity framework
  3. Functions that are not present in PostgreSQL. Those are called user-defined functions

Functions already translated/mapped by the PostgreSQL provider can be used directly in LINQ queries through the EF.Functions class. The provider supplies a healthy list of PostgreSQL functions.

There is also a list of PostGIS NetTopologySuite functions that can be used directly in LINQ queries.

There are occasions where PostgreSQL provides a function for a specific purpose but it is not mapped/translated to Entity Framework. These need to be translated manually. I have translated some such functions, which the user can review in the Witline.Common project.

The third type is user-defined functions, which are not present in PostgreSQL and are created for a specific purpose. For example, MS SQL Server has a datediff function that returns the difference between two dates; PostgreSQL has no equivalent, so I created one for the same purpose. For such functions there are two approaches.

First, create the function in the PostgreSQL database and translate it just as we do for regular functions. The drawback is that whenever a new database is created, the user must manually import the function into it.

The second way is to embed the function script in code; the code then posts the script to the PostgreSQL database, and the developer is responsible for the mapping and translation. I created the datediff function this way. Here is the script, which creates the datediff function:

private const string DateDiffQuery = @"CREATE OR REPLACE FUNCTION public.datediff(
                                           ""to"" timestamp without time zone,
                                           ""from"" timestamp without time zone)
                                                RETURNS numeric
                                                LANGUAGE 'plpgsql'
                                                COST 100
                                                VOLATILE PARALLEL UNSAFE
                                            AS $BODY$
                                                declare res numeric;
                                                begin

                                                res := (DATE_PART('day', ""to"" - ""from"") * 24 +
                                                  DATE_PART('hour', ""to"" - ""from"")) * 60 +
                                                  DATE_PART('minute', ""to"" - ""from"");
                                                return res;
                                                end;
                                            $BODY$;";

The functions below execute the above script and add the datediff function to the PostgreSQL database:

private static bool AddFunctionPg(DbContext context, PostgresFunctionEnum functionName)
{
    using var command = context.Database.GetDbConnection().CreateCommand();
    command.CommandText = PostgresFunctionsQuery.GetFunctionQuery(functionName);
    command.CommandType = System.Data.CommandType.Text;
    command.Connection.Open();
    var val = command.ExecuteNonQuery();
    command.Connection.Close();
    return val > 0;
}
/// <summary>
/// This will create a new user-defined datediff function in the postgres database.
/// <remarks> The function <see cref="SqlFunctions.DateDiff"/> can be used in LINQ query</remarks>
/// </summary>
/// <param name="context"></param>
public static void AddUserDefinedFunctions(this DbContext context)
{
    AddFunctionPg(context, PostgresFunctionEnum.DateDiff);
}

The datediff function translation is the same as for regular function translation.
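For reference, that translation step usually looks like the sketch below. The SqlFunctions.DateDiff signature matches the one listed in the next section, but the exact registration code in Witline.Common may differ:

```csharp
using System;
using Microsoft.EntityFrameworkCore;

public static class SqlFunctions
{
    // CLR stub; EF Core translates calls to the SQL datediff function,
    // so the body is never executed server-side.
    public static decimal DateDiff(DateTime to, DateTime from)
        => throw new NotSupportedException("Only usable inside LINQ-to-entities queries.");
}

public static class ModelBuilderExtensions
{
    public static void AddPostgresUserDefinedFunctions(this ModelBuilder modelBuilder)
    {
        // Map the CLR stub to the public.datediff function created by AddUserDefinedFunctions().
        modelBuilder.HasDbFunction(
                typeof(SqlFunctions).GetMethod(nameof(SqlFunctions.DateDiff)))
            .HasName("datediff")
            .HasSchema("public");
    }
}
```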

Functions Mapped to Entity Framework

The following functions are translated to Entity Framework. The developer can use those functions directly in LINQ queries.

public static DateTime DateTrunc(string field, DateTime source)
public static decimal DateDiff(DateTime to, DateTime from)
public static double Extract(string field, DateTime source)
public static decimal Trunc(decimal number)
public static decimal Trunc(decimal number, int precision)

The user can find the detailed implementation of those functions in these classes

Linq Queries

The template itself contains no TimescaleDB-specific queries; I borrowed these from the TimescaleDB sample project. Below are different LINQ queries that show how we can use the TimescaleDB functions:

public IQueryable GetDailyRides(DateTime? startDate, DateTime? endDate)
{
    var ctx = (ApplicationContext)_context;
    var qry =
        from r in ctx.Rides
        select new { Day = SqlFunctions.DateTrunc("day", r.PickupTime), Ride = r };
    #region Filter
    if (endDate.HasValue && startDate.HasValue)
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value && f.Ride.PickupTime <= endDate.Value);
    else if (endDate.HasValue && !startDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime <= endDate.Value);
    }
    else if (startDate.HasValue && !endDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value);
    }
    else
    {
        endDate = new DateTime(2016, 01, 08);
        qry = qry.Where(f => f.Ride.PickupTime <= endDate.Value);
    }
    #endregion
    return
        from d in qry
        group d by d.Day into g
        select new { Day = g.Key, Count = g.Count() };
}
public IQueryable GetDailyFarAmount(int passangerCount = 1, DateTime? startDate = null, DateTime? endDate = null)
{
    var ctx = (ApplicationContext)_context;
    var qry =
        from r in ctx.Rides
        where r.PassengerCount == passangerCount
        select new { Day = SqlFunctions.DateTrunc("day", r.PickupTime), Ride = r };
    #region Filter
    if (endDate.HasValue && startDate.HasValue)
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value && f.Ride.PickupTime <= endDate.Value);
    else if (endDate.HasValue && !startDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime <= endDate.Value);

    }
    else if (startDate.HasValue && !endDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value);
    }
    else
    {
        endDate = new DateTime(2016, 02, 08);
        qry = qry.Where(f => f.Ride.PickupTime < endDate.Value);
    }
    #endregion
    var gquery =
        from d in qry
        group d by d.Day into g
        select new { Day = g.Key, Count = g.Count(), AverageFare = g.Average(a => a.Ride.FareAmount) };
    return gquery;
}
public IQueryable GetRidesByRateType(int?[] rateTypes = null, DateTime? startDate = null, DateTime? endDate = null, bool showRateCodeDescriptions = false)
{
    var ctx = (ApplicationContext)_context;
    var qry =
        from r in ctx.Rides
        select new { Ride = r };
    #region Filter
    //if(showRateCodeDescriptions)
    //{
    //    qry = qry.Join(ctx.Rates, k => k.RateCode, ki => ki.RateCode, (r, rate) => new { Ride = r, Rate = rate });
    //}
    if (endDate.HasValue && startDate.HasValue)
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value && f.Ride.PickupTime <= endDate.Value);
    else if (endDate.HasValue && !startDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime <= endDate.Value);

    }
    else if (startDate.HasValue && !endDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value);
    }
    else
    {
        endDate = new DateTime(2016, 02, 08);
        qry = qry.Where(f => f.Ride.PickupTime < endDate.Value);
    }
    if (rateTypes != null && rateTypes.Length > 0)
    {
        qry = qry.Where(f => rateTypes.Contains(f.Ride.RateCode));
    }
    #endregion
    if (showRateCodeDescriptions)
    {
        var gjquery =
                from d in qry
                join rate in ctx.Rates on d.Ride.RateCode equals rate.RateCode
                group new { d, rate } by new { d.Ride.RateCode, rate.Decription } into g
                select new { RateCode = g.Key.RateCode, RateDescription = g.Key.Decription, Count = g.Count() };
        return gjquery;
    }
    var gquery =
        from d in qry
        group d by d.Ride.RateCode into g
        select new { RateCode = g.Key, Count = g.Count() };
    return gquery;
}
public IQueryable GetAvgTripDetails(int?[] rateTypes = null, DateTime? startDate = null, DateTime? endDate = null, bool showRateCodeDescriptions = false)
{
    var ctx = (ApplicationContext)_context;

    var qry =
        from r in ctx.Rides
        select new { Ride = r };
    #region Filter
    if (rateTypes == null || rateTypes.Length == 0)
    {
        rateTypes = new int?[] { 2, 3 }; /**/
        qry = qry.Where(f => rateTypes.Contains(f.Ride.RateCode));
    }
    if (endDate.HasValue && startDate.HasValue)
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value && f.Ride.PickupTime <= endDate.Value);
    else if (endDate.HasValue && !startDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime <= endDate.Value);

    }
    else if (startDate.HasValue && !endDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value);
    }
    else
    {
        endDate = new DateTime(2016, 02, 01);
        qry = qry.Where(f => f.Ride.PickupTime < endDate.Value);
    }
    if (rateTypes != null && rateTypes.Length > 0)
    {
        //qry = qry.Where(f => rateTypes.Contains(f.Ride.RateCode));
    }
    #endregion

    var gjquery =
            from d in qry
            join rate in ctx.Rates on d.Ride.RateCode equals rate.RateCode
            group new { d.Ride, rate } by rate.Decription into g
            select new
            {
                RateCode = g.Key,
                NumberOfTrips = g.Count(),
                AvgTripDuration = g.Average(a => SqlFunctions.DateDiff(a.Ride.DropOffTime, a.Ride.PickupTime)),
                AvgAmount = g.Average(a => a.Ride.TotalAmount),
                AvgTipAmount = g.Average(a => a.Ride.TipAmount),
                MinimumDistance = g.Min(a => a.Ride.TripDistance),
                AvgDistance = g.Average(a => a.Ride.TripDistance),
                MaxDistance = g.Max(a => a.Ride.TripDistance),
                AvgPassanger = g.Average(a => a.Ride.PassengerCount)
            };
    return gjquery;
}
public IQueryable GetEvery5MinuteRides(DateTime? startDate = null, DateTime? endDate = null)
{
    var ctx = (ApplicationContext)_context;
    var qry =
        from r in ctx.Rides
        select new
        {
            Hours = SqlFunctions.Extract("hour", r.PickupTime),
            FiveMinutes = SqlFunctions.Trunc((decimal)SqlFunctions.Extract("minute", r.PickupTime) / 5) * 5,
            Ride = r
        };

    #region Filter
    if (endDate.HasValue && startDate.HasValue)
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value && f.Ride.PickupTime <= endDate.Value);
    else if (endDate.HasValue && !startDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime <= endDate.Value);
    }
    else if (startDate.HasValue && !endDate.HasValue)
    {
        qry = qry.Where(f => f.Ride.PickupTime >= startDate.Value);
    }
    else
    {
        // Default window when no dates are supplied.
        endDate = new DateTime(2016, 01, 02);
        qry = qry.Where(f => f.Ride.PickupTime < endDate.Value);
    }
    #endregion
    var gjquery =
            from d in qry
            join rate in ctx.Rates on d.Ride.RateCode equals rate.RateCode
            group new { d.Ride, rate } by new { d.Hours, d.FiveMinutes } into g
            select new
            {
                g.Key.Hours,
                g.Key.FiveMinutes,
                Rides = g.Count()
            };
    return gjquery;
}

TimescaleDB Annexures

User registration flow using Swagger

  1. /api/User/register

Register a user by entering all the required fields.

Example:

{
 "Id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
 "IsActive": true,
 "LoginEmail": "qamar3210@gmail.com",
 "TryJsonB": "string",
 "UserName": "qamar3210",
 "Password": "YORU Password",
 "FirstName": "Qamar",
 "MiddleName": "",
 "LastName": "Abbas",
 "LoginMobile": "0333-9642432",
 "MACAddress": "string",
 "CustodianId": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
 "Roles": [
 "string" //Get role id from Roles table (postgres)
 ]
}
  2. /api/User/verifymobile

Verify Mobile number

The user needs a verification code (token), which can be found in the users table in MongoDB (localhost:8081). Look up the required record and get MobileVerificationToken.

Example:

 {
  "LoginMobile": "0333-9642432",
  "LoginEmail": "qamar3210@gmail.com",
  "Token": "2342",   // MobileVerificationToken
  "Password": "string"  //Keep Password untouched
 }
  3. /api/User/verifyemail/

Verify Email

The user needs the email verification token. The email verification token can be found in the users table of MongoDB (localhost:8081). Look for the required record and get its EmailVerificationToken.

Example: just pass the EmailVerificationToken.

  4. /api/Device/register

Finally, the device needs to be registered

Example:

 {
  "Id": "3fa85f64-5717-4562-b3fc-2c963f66afa6",
  "IsActive": true,
  "MACAddress": "mac of your device" //Mac of your device or any random string. This string will be used in user login process
 }

User registration is completed

User login using Swagger

api/User/login

Fill all the fields

 UserName:
 Password:
 MACAddress:  //Same string which is used during device registration

New ASP.NET middleware (SecurityGuard, ODataGuard)

  • SecurityGuard

  • ODataGuard

SecurityGuard

Two new entities were added for SecurityGuard:

  • AppGuard

  • AppGuardClaims

These entities hold information about controllers, actions, and the decorated claims attributes (ClaimType, ClaimValue).

SecurityGuard uses this entity data to dynamically authenticate and authorize API requests. For unauthorized requests, SecurityGuard short-circuits the middleware pipeline and terminates the current request. SecurityGuard works seamlessly alongside other authorization mechanisms.

For custom authorization there is no need to add controller/action information to the database; the SecurityGuard middleware will forward the request to the custom or default authorization filter.

New custom logic can be added to SecurityGuard to further enhance its capabilities.
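For illustration, the core authorization check can be sketched as follows. This is a minimal sketch, not the actual SecurityGuard implementation; GuardRule and the in-memory rule list are hypothetical stand-ins for the AppGuard/AppGuardClaims data:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Security.Claims;

// Hypothetical in-memory stand-in for the AppGuard/AppGuardClaims tables.
public record GuardRule(string Controller, string Action, string ClaimType, string ClaimValue);

public static class SecurityGuardSketch
{
    // Returns true when the caller holds every claim registered for the endpoint,
    // or when the endpoint has no registered rules (left to other authorization filters).
    public static bool IsAllowed(
        IEnumerable<GuardRule> rules,
        string controller, string action,
        IEnumerable<Claim> userClaims)
    {
        var required = rules
            .Where(r => r.Controller == controller && r.Action == action)
            .ToList();

        if (required.Count == 0)
            return true; // no rule registered: forward to custom/default authorization

        return required.All(r =>
            userClaims.Any(c => c.Type == r.ClaimType && c.Value == r.ClaimValue));
    }
}
```

The real middleware terminates the request (short-circuits the pipeline) when this check fails instead of returning a bool.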

ODataGuard

ODataGuard is used to filter OData query options ($select, $filter, $expand, $orderby, $apply, $count, $skip). This prevents query forgery.

Example: a user has permission to view a User entity with the fields [UserName, FirstName, Password]. A forged $select query could be [LoginName, UserName, FirstName, Password]; ODataGuard will filter $select and remove LoginName from the query.

The current version of ODataGuard removes $expand and does not filter $filter and $orderby; those query options must be handled separately.
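The $select filtering idea can be sketched like this (an illustrative helper assuming a simple comma-separated $select value; not the actual ODataGuard code):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class ODataGuardSketch
{
    // Removes any field from a $select clause that is not in the allowed set.
    public static string FilterSelect(string select, ISet<string> allowedFields)
    {
        var kept = select.Split(',')
            .Select(s => s.Trim())
            .Where(s => s.Length > 0 && allowedFields.Contains(s));
        return string.Join(",", kept);
    }
}
```

With the example above, filtering "LoginName,UserName,FirstName,Password" against the allowed set {UserName, FirstName, Password} yields "UserName,FirstName,Password".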

How to Use Neo Blockchain

The Neo blockchain requires the Neo.Foundation nuget package. We will use the Neo blockchain only in projects where it is required; not every project requires or uses it.

To implement the Neo blockchain in your API project, follow these steps.

To set up a neo-cli private chain for a project, copy the neo-cli folder into the project directory.

Neo-cli Configuration

  1. Delete the following folders: ApplicationLogs_*, ConsensusState, Data_LevelDB_*, Logs, Nep17BalanceData

  2. Open the config file and set UnlockWallet > IsActive to false:

    {
      "ApplicationConfiguration": {
        "Logger": {
          "Path": "Logs",
          "ConsoleOutput": true,
          "Active": true
        },
        "Storage": {
          "Engine": "LevelDBStore",
          "Path": "Data_LevelDB_{0}"
        },
        "P2P": {
          "Port": 60331,
          "WsPort": 60332
        },
        "UnlockWallet": {
          "Path": "consensus.json",
          "Password": "Template.Api123", //Set Your Password
          "IsActive": false
        },
        "PluginURL": "https://github.com/neo-project/neo-modules/releases/download/v{1}/{0}.zip"
      },
      "ProtocolConfiguration": {
        "Network": 6309138, //external client, mainNet 860833102
        "AddressVersion": 53,
        "MillisecondsPerBlock": 15000,
        "MaxTransactionsPerBlock": 512,
        "MemoryPoolMaxTransactions": 50000,
        "MaxTraceableBlocks": 2102400,
        "InitialGasDistribution": 5200000000000000,
        "ValidatorsCount": 1,
        "StandbyCommittee": [
          "039f1527340d65910fbf89c6843ebdb324312ad7a8ff1b93f689d76d62eab94451"
        ],
        "SeedList": [ 
        ]
      }
    }
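The P2P ports in the config above must not be used by any other application. A quick generic way to verify a TCP port is free before assigning it (a hypothetical helper, not part of the template):

```csharp
using System;
using System.Net;
using System.Net.Sockets;

public static class PortCheck
{
    // Returns true when nothing is currently listening on the given TCP port.
    public static bool IsTcpPortFree(int port)
    {
        try
        {
            var listener = new TcpListener(IPAddress.Loopback, port);
            listener.Start();   // throws SocketException if the port is taken
            listener.Stop();
            return true;
        }
        catch (SocketException)
        {
            return false;
        }
    }
}
```

For example, `PortCheck.IsTcpPortFree(60331)` can be checked before writing 60331 into the P2P section.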
  3. Run neo-cli from the folder that you copied in the earlier step.
  4. Execute the following command in the neo-cli console: create wallet consensus.json. The system will ask you to enter and confirm a password. Keep this password safe.
  5. Note down the public key that is generated during step 4, as it will be required in later steps.
  6. Close the neo-cli console application.
  7. Re-open the config file.
  8. Set the UnlockWallet section with the following settings: "UnlockWallet": { "Path": "consensus.json", "Password": "Enter the password that you set during wallet creation", "IsActive": true }
  9. Set the ports. The ports must not be used by any other application: "P2P": { "Port": 60331, "WsPort": 60332 }
  10. Set the protocol configuration. Note the Network number:
"ProtocolConfiguration": {
 "Network": 6309138, //external client, mainNet 860833102
 "AddressVersion": 53,
 "MillisecondsPerBlock": 15000,
 "MaxTransactionsPerBlock": 512,
 "MemoryPoolMaxTransactions": 50000,
 "MaxTraceableBlocks": 2102400,
 "InitialGasDistribution": 5200000000000000,
 "ValidatorsCount": 1,
 "StandbyCommittee": [
  "039f1527340d65910fbf89c6843ebdb324312ad7a8ff1b93f689d76d62eab94451" //public key that is generated during wallet creation
 ],
 "SeedList": [ 
 ]
}
  11. Set the public key in the StandbyCommittee section.
  12. Now move to the plugins folder. Open each plugin's config file and set the same Network number (6309138) in all plugins. If there is any port in a config file, make sure that port is not used by any other application.
  13. That's it; the neo-cli configuration is done.
  14. You need to set up some settings in the appsettings of the project:
 /** Neo Blockchain Settings */
  "BlockChain": {
    "RunBlockChain": true, /* [true, false, default(false or no setting at all)] If this project requires a blockchain then set the value to true. The system will run the blockchain */
    "NeoCliPath": "Neo-cli\\neo-cli.exe", /* Neo cli executable path */
    "WalletPassword": "Template.Api123", /* Password that is used during creation of wallet*/
    "WalletName": "consensus.json", /* Wallet Name */
    "PublicKey": "039f1527340d65910fbf89c6843ebdb324312ad7a8ff1b93f689d76d62eab94451" /* Public key that is generated during wallet creation during step 4*/
  },
  15. Add the following code in your Program file just after the ApplicationContext is initialized:
var context = services.GetRequiredService<ApplicationContext>();
// Neo Cli application startup
var blockConfig = configuration.BlockChainConfiguration();
if (blockConfig.RunBlockChain)
{
    // Register this project's blockchain server info once
    var project = context.BlockChainServerInfo.FirstOrDefault(f => f.ProjectName == Namespace);
    if (project == null)
    {
        var now = DateTime.Now;
        context.BlockChainServerInfo.Add(new Domain.Entities.BlockchainServerInfo
        {
            Id = Guid.NewGuid(),
            ProjectName = Namespace,
            CreatedOn = now,
            ModifiedOn = now,
            WalletPublicKey = blockConfig.PublicKey,
            WalletPassword = blockConfig.WalletPassword,
            WalletPath = blockConfig.GetWalletPath(),
        });
        context.SaveChanges();
    }

    // Restart neo-cli whenever its process exits
    NeoProcessMonitor.ProcessClosed += (sender, args) =>
    {
        if (NeoProcessMonitor.RunBlockChainIfNotRunning(configuration))
            Log.Information("Neo cli is started...");
        else
            Log.Information("Neo Cli is already running...");
    };

    // Initial start if neo-cli is not already running
    if (!NeoProcessMonitor.IsBlockChainRunning(configuration))
    {
        if (NeoProcessMonitor.RunBlockChainIfNotRunning(configuration))
            Log.Information("Neo cli is started...");
        else
            Log.Information("Neo Cli is already running...");
    }
}
  16. Repeat step 1 (delete the generated data folders) before starting your API project.

tsvector, tsquery Full Text Search

PostgreSQL has built-in support for full-text search, which allows you to conveniently and efficiently query natural language documents.

As I said earlier, full-text search is a built-in feature of Postgres. We don't need to create or import functions related to tsvector/tsquery, as I did for TimescaleDB.

I have added an example in the User Management project. The sample consists of one table called TsVectorQuery:

using NpgsqlTypes;

public class TsVectorQuery : BaseEntity
{
    public string Name { get; set; }
    public string Description { get; set; }
    public NpgsqlTsVector SearchVector { get; set; }
}

Note

Don't use this sample in a live project. The sample demonstrates how to use tsvector/tsquery in your project. If tsvector/tsquery is not required, then remove the code related to tsvector/tsquery.

The user must also remove the 20220106060613_TsVectorQuery_Added migration

tsvector/tsquery sample walkthrough

The following classes are used for tsvector/tsquery:

  • TsVectorQuery: the actual entity class
  • IFullTextSearchService, FullTextSearchService: service-related classes. The IFullTextSearchService is registered in the startup class of the Usermanagment.Api project:
/* Full text search service registration. It uses the postgres tsvector and tsquery features */
services.AddScoped<IFullTextSearchService, FullTextSearchService>();
  • TextSearchController: the controller class
  • ApplicationContext: the following code sets an index on the Description and Name columns:
#region TsVector & TsQuery sample code for TsVectorQuery Entity
modelBuilder.Entity<TsVectorQuery>()
 .HasGeneratedTsVectorColumn(
  p => p.SearchVector,
  "english",  // Text search config
  p => new { p.Name, p.Description })  // Included properties
 .HasIndex(p => p.SearchVector)
 .HasMethod("GIN"); // Index method on the search vector (GIN or GIST)
#endregion
  • Lastly, we register the entity in ApplicationContext:
public DbSet<TsVectorQuery> TsVectorQueries { get; set; }

We can query the tsvector/tsquery column in our LINQ query like this:

return _context.TsVectorQueries
    .Where(f => f.SearchVector.Matches(keywords));

Or by using the EF.Functions extension methods provided by the Npgsql EF Core provider:

return _context.TsVectorQueries
    .Where(f => EF.Functions.ToTsVector("english", f.Name + " " + f.Description)
     .Matches(keywords));

Postgres provides a rich list of full-text search functions that can be utilized in various scenarios.
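For example, matches can be ranked by relevance. A sketch against the sample context above, assuming the Matches/Rank/PlainToTsQuery mappings provided by the Npgsql EF Core provider:

```csharp
// Rank matches by relevance; assumes the TsVectorQueries DbSet shown above.
var tsQuery = EF.Functions.PlainToTsQuery("english", keywords);
return _context.TsVectorQueries
    .Where(f => f.SearchVector.Matches(tsQuery))
    .OrderByDescending(f => f.SearchVector.Rank(tsQuery));
```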

Entity Framework Core Migration

add-migration -context Template.Business.DataAccess.ApplicationContext TsVectorQuery_Added

The above command adds a new migration named TsVectorQuery_Added. Use a name like Initial or FirstMigration if it is the initial/first migration.

You don't need the -context switch when there is only one DbContext in the project.
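For reference, the equivalent command using the cross-platform dotnet-ef CLI (assuming the dotnet-ef tool is installed) would be:

```shell
dotnet ef migrations add TsVectorQuery_Added --context Template.Business.DataAccess.ApplicationContext
```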

The project has base API-related functionality, e.g. some base controllers, some generic controllers, etc.
