Tuesday, May 12, 2015

Simple Router

In this post I am going to discuss the Router pattern. A lot of the time in an application we want to send different kinds of messages and have them delivered to the appropriate receiver based on some rule, the message type, etc.

This is similar to MVC, where you register routes and a request is then delivered to the appropriate handler based on the route it matches. Azure Service Bus provides similar functionality with Topics/Subscriptions. It is a little different from Publisher-Subscriber or Observer, in the sense that there can be multiple senders and multiple receivers.

This is a simple implementation of that.

https://gist.github.com/loneshark99/855209c7aa5d645516c5

Thanks,
Yash

Saturday, April 4, 2015

Sharing files between Linux and Windows using Samba

I wanted to share some files between my Linux Ubuntu machine and my Windows 8.1 machine. Samba makes this really easy to do. Samba is free software and an implementation of the Server Message Block (SMB) protocol used for file and printer sharing over the network.

On the Linux machine first you need to install Samba.

sudo apt-get install samba

You can check whether the Samba service is installed and running:

service --status-all | grep samba

Once Samba is installed you need to update the Samba config file, /etc/samba/smb.conf, to set up your share details. First take a backup of smb.conf in case something gets messed up, then update it with just these details.

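The share section of my smb.conf looked roughly like this (these are standard smb.conf settings, but treat the exact values as placeholders for your own setup):

[global]
   # must match your Windows machine's workgroup name
   workgroup = WORKGROUP

[LinuxShare]
   # the folder being shared and the Samba users allowed to use it
   path = /home/yash/Downloads
   valid users = yash
   read only = no
   browseable = yes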

As you can see, I have set up the path for sharing as /home/yash/Downloads, and the workgroup is the Windows machine's workgroup. The share name will be LinuxShare. One important thing to note: if you want to set up passwords, you will first have to create a Samba user and then add that user to the valid users list.

sudo smbpasswd -a yash
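After adding the user and saving smb.conf, restart the Samba service so the changes take effect (on Ubuntu the daemon is typically called smbd, so something like this should work):

sudo service smbd restart

The share should then be reachable from the Windows machine as \\<linux-hostname>\LinuxShare.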

-Yash


Sunday, March 29, 2015

ssh without using password

I have 3 laptops at home, a Linux server and 2 Windows 8 machines, and I switch back and forth between them. I have installed an SSH server on the Linux machine, and from Windows I ssh into it and work on it.

Today's blog entry is very simple: I didn't want to enter my password every time I log in to the server. These are the simple steps you can follow to set it up and get a basic understanding of how it works.

You need to have the SSH service running on the Linux server; you can quickly check that by running this command:

service --status-all | grep "ssh"

Then on the Windows machine, install my favorite X terminal for Windows, MobaXterm, which you can download from the MobaXterm website.

1) The main thing is that on the client you need to generate a key pair (public + private key), preferably using the RSA algorithm.

ssh-keygen -t rsa

This will create 2 files (id_rsa and id_rsa.pub), which are the private key and the public key respectively.

2) Then you need to copy the public key onto the Linux server. You can do that using the following command.

ssh-copy-id Username@Host

3) Once you have copied the file over, make sure its contents get appended to the ~/.ssh/authorized_keys file on the server (ssh-copy-id normally does this for you).
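If you copied the .pub file over some other way, the append step is just something like this (assuming the file landed in your home directory on the server):

cat ~/id_rsa.pub >> ~/.ssh/authorized_keys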

4) Now you can use the same private key from any of the Windows boxes and log in to the Linux server without a password.
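With MobaXterm, or any OpenSSH-style client, that just looks like a normal ssh login; if the private key is not in the client's default location, you can point to it explicitly (the hostname here is only a placeholder):

ssh -i ~/.ssh/id_rsa yash@linux-server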

On the Linux server, you can verify the key was added by looking at the contents of the authorized_keys file:
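cat ~/.ssh/authorized_keys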

When you generate the private and public key, you will get an interactive prompt for a passphrase; you can use one if you like to be more secure, in case the private key file gets compromised.

You can view the SSH logs if you encounter any errors while logging in; they are in the /var/log/auth.log file.


Thanks,
Yash

Sunday, November 16, 2014

SQL Server table as a Queue with SQL Hints

Queues can be implemented in SQL Server using tables. Consider this scenario: we have 1 producer and N consumers which pick up jobs and execute them. There are concurrency issues with this that need to be taken care of.

There are 3 concurrency issues we have to keep an eye on in any asynchronous application in SQL Server, namely:

1) Dirty reads (do not happen at the READ COMMITTED isolation level)
2) Non-repeatable reads (can happen at the READ COMMITTED isolation level)
3) Phantom reads (can happen at the READ COMMITTED isolation level)

In this scenario, the issue is:

1) Consumers can pick up the same job and start executing it, without knowing that another consumer has already picked it up.

CREATE DATABASE YashTempDb
GO
USE YashTempDb
GO
-- Drop the queue table if it already exists so the script can be re-run
IF OBJECT_ID('dbo.Queue') IS NOT NULL
    DROP TABLE [Queue]
GO
CREATE TABLE [Queue]
(
    JobId             INT IDENTITY PRIMARY KEY,
    JobName           NVARCHAR(256),
    IsExecuting       BIT,
    LastExecutionTime DATETIME2
)
GO
INSERT INTO Queue(JobName,IsExecuting,LastExecutionTime)
VALUES
('One',0,DateAdd(minute,0,GETDATE())),
('Two',0,DateAdd(minute,-1,GETDATE())),
('Three',0,DateAdd(minute,-2,GETDATE())),
('Four',0,DateAdd(minute,-3,GETDATE())),
('Five',0,DateAdd(minute,-4,GETDATE())),
('Six',0,DateAdd(minute,-5,GETDATE())),
('Seven',0,DateAdd(minute,-6,GETDATE())),
('Eight',0,DateAdd(minute,-7,GETDATE())),
('Nine',0,DateAdd(minute,-8,GETDATE())),
('Ten',0,DateAdd(minute,-9,GETDATE()))
GO
Declare @JobId INT
Declare @Count INT
SET @Count = 0;

WHILE (@Count < 5)
BEGIN
BEGIN TRAN
  SET NOCOUNT ON

   -- Pick up the oldest job that is not already executing.
   -- UPDLOCK holds the lock until the transaction commits, ROWLOCK keeps the locking
   -- at row granularity, and READPAST lets other consumers skip locked rows.
   SET @JobId = (SELECT TOP 1 JobId FROM [Queue] WITH (UPDLOCK, ROWLOCK, READPAST) WHERE IsExecuting = 0 ORDER BY LastExecutionTime ASC)
   -- Print the picked-up JobId
   PRINT @JobId
   -- Set the flag to executing
   UPDATE [Queue] WITH (ROWLOCK)
   SET IsExecuting = 1
   WHERE JobId = @JobId
   PRINT 'SET THE ISEXECUTING FLAG TO EXECUTING'
   -- Simulate the job running (in real code, move the work outside the transaction so you do not hold locks for that long)
   WAITFOR DELAY '00:00:10'
   -- Update the JobId after executing it
   UPDATE [Queue] WITH (ROWLOCK)
   SET LastExecutionTime = DateAdd(Minute,1,GETDATE()), IsExecuting = 0
   WHERE JobId = @JobId

   PRINT Concat('COMPLETED ', @JobId)
   SET @Count = @Count + 1
      COMMIT TRAN
END


Using the ROWLOCK and READPAST hints (plus UPDLOCK, so the selected row stays locked until the transaction commits), we can guarantee that the same row will not be picked up by another transaction.

READPAST -- (As per MSDN Documentation)  "Specifies that the Database Engine not read rows that are locked by other transactions. When READPAST is specified, row-level locks are skipped. That is, the Database Engine skips past the rows instead of blocking the current transaction until the locks are released. READPAST is primarily used to reduce locking contention when implementing a work queue that uses a SQL Server table. A queue reader that uses READPAST skips past queue entries locked by other transactions to the next available queue entry, without having to wait until the other transactions release their locks."

Hope this is useful and helps reduce concurrency issues with queues.

Thanks,
Yash

Tuesday, July 29, 2014

Running Total

Often when you are creating reports you want to compute a running total, e.g. the running total for a row = that row's value + the sum of all the preceding rows' values.

There are many ways to do this. I just wanted to outline some of the standard techniques used to do this, plus the new feature in SQL Server 2012 which makes it very easy and fast.
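All three examples below read from a temp table #X. The exact schema is in the SQL Fiddle link at the end; a minimal sketch of the kind of table they assume (the column types and sample values here are made up for illustration) is:

CREATE TABLE #X
(
    PhysicalFileId INT PRIMARY KEY,
    BytesWritten   BIGINT,
    CreatedDate    DATETIME2
);

INSERT INTO #X (PhysicalFileId, BytesWritten, CreatedDate)
VALUES (1, 100, GETDATE()),
       (2, 250, GETDATE()),
       (3, 175, GETDATE());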

1) Using Non-Equi Joins

SELECT A.PhysicalFileId, A.BytesWritten, A.CreatedDate,
       SUM(B.BytesWritten) AS RunningTotal
FROM #X A
INNER JOIN #X B ON B.PhysicalFileId <= A.PhysicalFileId
GROUP BY A.PhysicalFileId, A.BytesWritten, A.CreatedDate

2) My personal favorite -- Correlated Query

SELECT A.PhysicalFileId, A.BytesWritten, A.CreatedDate,
       (SELECT SUM(B.BytesWritten) FROM #X B WHERE B.PhysicalFileId <= A.PhysicalFileId) AS RunningTotal
FROM #X A

3) Available in SQL Server 2012 and beyond, and the fastest of all: using window functions

SELECT A.PhysicalFileId, A.BytesWritten, A.CreatedDate,
       SUM(A.BytesWritten) OVER (ORDER BY A.PhysicalFileId) AS RunningTotal
FROM #X AS A

You can find the sql queries at :: http://sqlfiddle.com/#!6/40d6e/3/0

Until next time!

-Yash

Sunday, July 27, 2014

Pivot table and its uses

Recently I started working on a BI project, which is a new domain for me. I had always wanted to work on a Business Intelligence project since that was one area I hadn't covered. I have been learning a lot of new and interesting concepts. We generate a lot of reports which are used by the management teams to make business decisions, and for that we use self-service Power BI. If you have done reporting you will know that the pivot table is one of the most fundamental ways of generating cross-tabular reports.

Pivot tables are basically used for 2 purposes:

1) To generate cross-tabular reports, i.e. converting rows into columns.
2) To work with an Entity-Attribute table (a key-value kind of table, like a dictionary), turning the attribute rows back into columns.

As an example ::  I have 2 tables with a one-to-many relationship, which are used to record API success/failure.

[dbo].[ApiLookUp]  (1) 
[dbo].[ApiHealthMonitoring]   (Many)

You can see the Table Schema and Queries here  :: http://sqlfiddle.com/#!6/d5fbf/3

The best way, in my opinion, to write a pivot query is to follow these steps:

1) Write the base query, i.e. the query that returns the rows you want to convert into columns. This becomes your base query.
2) Pick the column from the base query that you want to aggregate, and the column whose values you want to pivot on.
3) Select the pivoted values as the output columns of the pivot query (see the sketch below).
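Here is a rough sketch of those three steps against the two tables above; the real schema and query are in the SQL Fiddle link, and the column names used here (ApiId, ApiName, Status, MonitoringId) are assumptions for illustration only:

-- 1) The base query (the derived table below) returns one row per monitoring entry,
--    with the column to pivot on (Status) and the column to group by (ApiName).
-- 2) COUNT(MonitoringId) is the aggregate applied for each pivoted Status value.
-- 3) [Success] and [Failure] become the output columns of the pivot query.
SELECT ApiName, [Success], [Failure]
FROM (
        SELECT l.ApiName, m.Status, m.MonitoringId
        FROM [dbo].[ApiHealthMonitoring] m
        INNER JOIN [dbo].[ApiLookUp] l ON l.ApiId = m.ApiId
     ) AS Base
PIVOT (COUNT(MonitoringId) FOR Status IN ([Success], [Failure])) AS P;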

One thing worth mentioning about the pivot query is that the columns in the pivot list need to be hardcoded. So if the values of the column you are pivoting on change over time, you need to either change the query frequently or use dynamic SQL, as sketched below.
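A rough sketch of the dynamic SQL approach, using the same assumed column names as above but this time pivoting on ApiName so that newly added APIs show up as columns automatically:

DECLARE @cols NVARCHAR(MAX), @sql NVARCHAR(MAX);

-- Build a comma-separated, quoted list of the current API names
SELECT @cols = STUFF((SELECT ',' + QUOTENAME(ApiName)
                      FROM [dbo].[ApiLookUp]
                      FOR XML PATH(''), TYPE).value('.', 'NVARCHAR(MAX)'), 1, 1, '');

-- Splice the column list into the pivot query and run it
SET @sql = N'SELECT ' + @cols + N'
             FROM (SELECT l.ApiName, m.MonitoringId
                   FROM [dbo].[ApiHealthMonitoring] m
                   INNER JOIN [dbo].[ApiLookUp] l ON l.ApiId = m.ApiId) AS Base
             PIVOT (COUNT(MonitoringId) FOR ApiName IN (' + @cols + N')) AS P;';

EXEC sp_executesql @sql;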

The same thing can be achieved very easily using the Excel Power Pivot feature. It is very powerful and simple to use: all you need to do is import the tables from the SQL Server database, add a pivot table and select the appropriate columns. You can even generate interactive charts and publish them to SharePoint using Power Query and Power Pivot. The deployment story for Excel reports is very good, and it is usable even by non-programmers.

You can also add slicers to the table, which help in slicing and dicing the data. Pretty powerful.

Until next time!

-Yash





Sunday, April 6, 2014

Love for Powershell!

This is a quick blog entry to express my love for PowerShell. I have been running into low disk space on my work computer, and I wanted to fix the issue.

Powershell to the rescue!

With a simple PowerShell command I was able to find the files occupying the most disk space and then go from there.

Get-ChildItem -Path C:\ -Recurse | Where-Object { $_.PSIsContainer -eq $false } |
    Sort-Object -Property Length -Descending |
    Select-Object -First 10 |
    Format-List -Property FullName, Length,
        @{ Name = "SizeInMB"; Expression = { $_.Length / 1MB } }

I have always been a command-line guy and I really enjoy writing small useful scripts!


Thanks,
Yash