Query to find the nth highest salary (here n = 3; change TOP 3 to TOP n)
SELECT TOP 1 salary
FROM (
SELECT DISTINCT TOP 3 salary
FROM dbo.empsalary
ORDER BY salary DESC) a
ORDER BY salary
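On SQL Server 2005 and later, the same result can also be obtained with the ROW_NUMBER() window function. A sketch, assuming the same dbo.empsalary table as above:

```sql
-- nth highest salary using ROW_NUMBER() (n = 3 in this sketch)
SELECT salary
FROM (
    SELECT salary,
           ROW_NUMBER() OVER (ORDER BY salary DESC) AS rn
    FROM (SELECT DISTINCT salary FROM dbo.empsalary) d
) ranked
WHERE rn = 3
```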
To find the size of a database
USE URDBNAME
SELECT * FROM sysfiles
EXEC sp_helpdb
EXEC sp_spaceused
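sp_spaceused also accepts an object name, which is handy when checking a single table rather than the whole database; the table name here is just an example:

```sql
-- space used by one table (assumes dbo.empsalary exists)
EXEC sp_spaceused 'dbo.empsalary'
```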
Create table from existing table
CREATE TABLE tblname (colnm) AS SELECT colnm FROM existingtblname
Precondition: the table in the SELECT clause must already exist, and the number of columns and their data types must match.
To copy only structure of table not data
CREATE TABLE tblname AS SELECT * FROM existingtblnm WHERE (any false condition)
Example: Create table tblname as select * from existingtblnm where 1=2
To insert records into a table from an existing table
Precondition: both tables exist and the data types must match
INSERT INTO table1 SELECT * FROM table2
Rename table
RENAME oldtblnm TO newtblnm
Drop table
DROP TABLE tablenm
To identify how many tables there are
SELECT * FROM TAB
To see the fields of a table
DESC tablename
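DESC is Oracle/SQL*Plus syntax; on SQL Server a rough equivalent (assuming the table exists in the current database) is:

```sql
-- column details for a table on SQL Server
EXEC sp_help 'tablename'
-- or just the column list:
EXEC sp_columns 'tablename'
```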
DUAL
Built-in table in Oracle with only 1 row and 1 column, used for getting the output of operations or the system date on screen
Example: select 2*2 from dual, select sysdate from dual
For viewing tables, views, SPs, and functions from a database
SELECT name, xtype FROM sysobjects WHERE xtype IN ('U','V','P','FN') AND name NOT LIKE 'dt%' ORDER BY xtype
Posted by Trupti Jethva at 7:26 PM 0 comments
Labels: Database Testing
Monday, April 11, 2011
About Normalization
What is Normalization?
Normalization is the process of efficiently organizing data in a database. There are two goals of the normalization process: eliminating redundant data (for example, storing the same data in more than one table) and ensuring data dependencies make sense (only storing related data in a table). Both of these are worthy goals as they reduce the amount of space a database consumes and ensure that data is logically stored.
First Normal Form (1NF)
First normal form (1NF) sets the very basic rules for an organized database:
* Eliminate duplicative columns from the same table.
* Create separate tables for each group of related data and identify each row with a unique column or set of columns (the primary key).
Second Normal Form (2NF)
Second normal form (2NF) further addresses the concept of removing duplicative data:
* Meet all the requirements of the first normal form.
* Remove subsets of data that apply to multiple rows of a table and place them in separate tables.
* Create relationships between these new tables and their predecessors through the use of foreign keys.
Third Normal Form (3NF)
The rules of 3NF are:
* Make a separate table for each set of related attributes, and give each table a primary key.
* If an attribute depends on only part of a multi-valued key, remove it to a separate table.
* If attributes do not contribute to a description of the key, remove them to a separate table.
Fourth Normal Form (4NF)
Finally, fourth normal form (4NF) has one additional requirement:
* Meet all the requirements of the third normal form.
* A relation is in 4NF if it has no multi-valued dependencies.
Remember, these normalization guidelines are cumulative. For a database to be in 2NF, it must first fulfill all the criteria of a 1NF database.
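As an illustration of removing duplicative data (table and column names here are made up for the example), an orders table that repeats customer details on every row violates these rules; splitting it out might look like:

```sql
-- Unnormalized: customer name repeated on every order row
CREATE TABLE OrdersFlat (
    OrderID      INT PRIMARY KEY,
    CustomerID   INT,
    CustomerName VARCHAR(50),   -- depends on CustomerID, not on OrderID
    OrderDate    DATETIME
);

-- Normalized: customer attributes moved to their own table,
-- linked back through a foreign key
CREATE TABLE Customers (
    CustomerID   INT PRIMARY KEY,
    CustomerName VARCHAR(50)
);
CREATE TABLE Orders (
    OrderID    INT PRIMARY KEY,
    CustomerID INT REFERENCES Customers(CustomerID),
    OrderDate  DATETIME
);
```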
Posted by Trupti Jethva at 11:50 AM 0 comments
Labels: Database Testing
Tuesday, December 28, 2010
Concept of Stored Procedure
What is a Stored Procedure?
Stored procedures are precompiled database queries (set of SQL statements) that improve the security, efficiency and usability of database client/server applications. Developers specify a stored procedure in terms of input and output variables. They then compile the code on the database platform and make it available to application developers for use in other environments, such as web applications. All of the major database platforms, including Oracle, SQL Server and MySQL support stored procedures. The major benefits of this technology are the substantial performance gains from precompiled execution, the reduction of client/server traffic, development efficiency gains from code reuse and abstraction and the security controls inherent in granting users permissions on specific stored procedures instead of the underlying database tables.
Benefits of Stored Procedures
• Precompiled execution. SQL Server compiles each stored procedure once and then reutilizes the execution plan. This results in tremendous performance boosts when stored procedures are called repeatedly.
• Reduced client/server traffic. If network bandwidth is a concern in your environment, you'll be happy to learn that stored procedures can reduce long SQL queries to a single line that is transmitted over the wire.
• Efficient reuse of code and programming abstraction. Stored procedures can be used by multiple users and client programs. If you utilize them in a planned manner, you'll find the development cycle takes less time.
• Enhanced security controls. You can grant users permission to execute a stored procedure independently of underlying table permissions.
Stored procedures are very similar to user-defined functions, but there are subtle differences. For more information, read Comparing Stored Procedures and User-Defined Functions.
Structure
Stored procedures are extremely similar to the constructs seen in other programming languages. They accept data in the form of input parameters that are specified at execution time. These input parameters (if implemented) are utilized in the execution of a series of statements that produce some result. This result is returned to the calling environment through the use of a recordset, output parameters and a return code. That may sound like a mouthful, but you'll find that stored procedures are actually quite simple.
Example: the following query retrieves inventory for one warehouse:
SELECT Product, Quantity
FROM Inventory
WHERE Warehouse = 'FL'
A stored procedure generalizes this by taking the warehouse as a parameter:
CREATE PROCEDURE sp_GetInventory
@location varchar(10)
AS
SELECT Product, Quantity
FROM Inventory
WHERE Warehouse = @location
Our Florida warehouse manager can then access inventory levels by issuing the command
EXECUTE sp_GetInventory 'FL'
The New York warehouse manager can use the same stored procedure to access that area's inventory.
EXECUTE sp_GetInventory 'NY'
How to execute a Stored Procedure
EXEC StoredProcedureName @parm1, @parm2, @parm3
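Parameters can also be passed by name, which is safer when a procedure has many or optional parameters; using the sp_GetInventory example from above:

```sql
-- positional and named forms are equivalent
EXECUTE sp_GetInventory 'FL'
EXECUTE sp_GetInventory @location = 'FL'
```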
Points to keep in mind when testing a Stored Procedure
Some of the points mentioned here were gathered from the following site; I added them for my future reference, to keep all the information in one place.
http://www.devx.com/getHelpOn/10MinuteSolution/20057/1954
Spell out your Procedure’s Purpose
A stored procedure works best when doing only one thing. However, some shops are under pressure to localize the logic underlying data modifications. This centralizes code maintenance by putting the logic in one procedure instead of three or four (e.g., procedures for insert, update, and delete). Unfortunately, combining these actions makes the underlying procedure exponentially more complicated. Instead of doing only one thing, the procedure is pressed into general service, doing three or four things. Your first line of defense in debugging this type of general-purpose stored procedure is a good header that enables you to document its purpose(s). Once the header is set up and maintained, you have a running history of what is happening in the code.
Header contains details like Author, Date, Last Rev, Revision History, Purpose, Creates, uses, parameters in, parameters out , Relations, Good Joins, Parking Lot
Determine the Flow of Control
By branching based on the parameters sent to the procedure, you delegate control to the relevant sections to do the DML required. Assume you have a procedure that has one parameter: @Action, which could be an Insert, Update, or Delete parameter. To test the value of @Action and perform accordingly, you need a structure something like this:
IF @Action = 'XXXXXX'
BEGIN
Your code here
END
You should get this fleshed out by pseudo-coding your intention first and then creating the logic to accomplish what you intend. Since most errors are logic errors, getting this down in stub form will assure that the code is at least executing the flow of control correctly.
Basic things to check while testing a Stored Procedure
* Naming convention of the stored procedure
* Whether the SP is part of the DB or not
* Name of parameters, number of parameters, type of parameters
* Check for the Outputs
o When output is zero (zero row affected)
o When some records are extracted
o Output contains many records
o What a stored procedure is supposed to do and not supposed to do
o Write simple queries to see if a stored procedure populates the right data
* Check for the input parameters
* Check for the valid data for each parameter
* Check for the invalid data for each parameter
* Check for the boundary value of data
* Check for all the required parameters
* Check parameter for data type
* Check for the Return values
* Check if the SP returns any values
* Check that if a failure occurs, it returns a nonzero value
* Check for the Error Messages
o Make stored procedure fail and cause every error message to occur at least once
o Find out any exception that doesn’t have a predefined error message
* Check for other values
* Whether a stored procedure grants correct access privilege to a group/user
* See if a stored procedure hits any trigger error, index error, and rule error
* Look into a procedure code and make sure major branches are test covered.
* Integration check for the procedure
* Group related stored procedures together and call them in a particular order
* If there are many sequences to call a group of procedures, find out equivalent classes and run tests to cover every class.
* Make invalid calling sequences and run a group of stored procedures.
* Design several test sequences in which end users are likely to do business, and do stress tests
Let the System Do the Work
By creating a parameter mill, you can feed a stored procedure a collection of parameters that represent the universe of things it can accept, enabling you to test the scope of conditions anticipated by your code. You can automate the boundary conditions with a table and procedure to do the drudgery for you.
CREATE PROCEDURE dbo.usp_TestFacilityRoster
@InAction VARCHAR(15),
@EntryMonth INT,
@EntryYear INT,
@FileType CHAR(1)
AS
IF @InAction = 'NEWREINS' OR @InAction = 'NEWREINSSNP'
BEGIN
--do stuff for new members
END
IF @InAction = 'DISENROL' OR @InAction = 'EXISTING'
BEGIN
--do common stuff for existing or disenrolled members
IF @InAction = 'DISENROL'
BEGIN
--specific for disenroll
END
ELSE IF @InAction = 'EXISTING'
BEGIN
--specific for existing
END
END
Create a test plan table with the following code
create table p1 (param1 char(8))
create table p2 (param2 int)
create table p3 (param3 int)
create table p4 (param4 char(1))
create table testplan (param1 char(8), param2 int, param3 int, param4 char(1))
insert p1 values('NEWREINS')
insert p1 values('DISENROL')
insert p1 values('EXISTING')
insert p2 values(1)
insert p2 values(2)
insert p2 values(3)
insert p2 values(4)
insert p2 values(5)
insert p2 values(6)
insert p2 values(7)
insert p2 values(8)
insert p2 values(9)
insert p2 values(10)
insert p2 values(11)
insert p2 values(12)
insert p3 values(1999)
insert p3 values(2000)
insert p3 values(2001)
insert p3 values(2002)
insert p3 values(2003)
insert p3 values(2004)
insert p4 values('A')
insert p4 values('S')
insert p4 values('P')
insert testplan select * from p1,p2,p3,p4
This will create a Cartesian Product (cross join) of all the combinations of the parameters your procedure can accept.
During development, you try to mimic the call to your procedure by writing something like this near the top of the code:
/* TEST BLOCK
set @InAction ='NEWREINS'
set @entrymonth = 7
set @entryyear = 2003
*/
Instead, you can create a string to execute the stored procedure that leaves placeholders for your procedure's parameters:
select 'Execute dbo.usp_TestFacilityRoster '
+ '''' + param1 + ''''
+ ',' + cast(param2 as char(2))
+ ',' + cast(param3 as char(4))
+ ',' + '''' + param4 + ''''
from testplan
This code doesn't execute it though. You need to use dynamic SQL to execute the string. So the above code can be wrapped in a cursor like this:
set nocount on
declare @param1 varchar(25)
,@param2 varchar(2)
,@param3 varchar(4)
,@param4 varchar(25)
declare C1 Cursor for
select param1, param2, param3, param4
from testplan (nolock)
open C1
fetch next from C1
into @param1, @param2, @param3, @param4
while @@fetch_status = 0
begin
exec dbo.usp_TestFacilityRoster @param1, @param2, @param3, @param4
fetch next from C1
into @param1, @param2, @param3, @param4
end
close C1
deallocate C1
-- this statement builds the string from the table
declare @sqlstring nvarchar(4000), @sqlstring2 nvarchar(4000)
set @sqlstring = N'exec dbo.usp_TestFacilityRoster '
set @sqlstring2 = N''
Update Testplan
Set @sqlstring2 = @sqlstring2 + cast(param1 as char(8)) + ', '
+ cast(param2 as char(2)) + ', ' + cast(param3 as char(4)) + ', '
+ cast(param4 as char(1))
print @sqlstring2
set @sqlstring = @sqlstring + @sqlstring2
execute sp_executesql @sqlstring
You can also use this approach from the command line and imitate the testing procedure outlined above by using the FOR ... IN ... DO construct in a batch file, or with the better flow-of-control constructs available in WSH, so that you can make repeated calls to your procedure with different parameter sets. However, the approach given here lets you work from within Query Analyzer without having to resort to the command window or another language to get the job done.
These ideas were necessary for me to circumvent the absence of permissions to run the SQL Debugger or to install my own error messages so that I could use RAISERROR in my assignment. This solution has provided a few ideas for getting around similar limitations in your own environment, and it lets you test your code automatically, giving a higher degree of confidence in the correctness of your solutions.
Posted by Trupti Jethva at 10:40 PM 0 comments
Labels: Database Testing
Thursday, December 9, 2010
Concept of JOIN
Concept of JOIN:
The JOIN keyword is used in an SQL statement to query data from two or more tables, based on a relationship between certain columns in these tables.
INNER JOIN
This join returns rows when there is at least one match in both tables.
Example:
SELECT * FROM Table1 T1 INNER JOIN Table2 T2 ON T1.col1=T2.col1
OUTER JOIN
There are three different Outer Join methods.
LEFT OUTER JOIN
This join returns all the rows from the left table together with the matching rows from the right table. If there is no matching row in the right table, it returns NULL values for the right table's columns.
Example:
SELECT * FROM Table1 T1 LEFT OUTER JOIN Table2 T2 ON T1.col1=T2.col1
RIGHT OUTER JOIN
This join returns all the rows from the right table together with the matching rows from the left table. If there is no matching row in the left table, it returns NULL values for the left table's columns.
Example:
SELECT * FROM Table1 T1 RIGHT OUTER JOIN Table2 T2 ON T1.col1=T2.col1
FULL OUTER JOIN
This join combines the left outer join and the right outer join. It returns rows from either table when the join condition is met, and NULL values when there is no match.
Example:
SELECT * FROM Table1 T1 FULL OUTER JOIN Table2 T2 ON T1.col1=T2.col1
CROSS JOIN
This join is a Cartesian join that does not require any condition to join. The result set contains the product of the number of rows in the two tables.
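Following the pattern of the examples above (Table1/Table2 are the same placeholder names), a cross join simply omits the ON clause:

```sql
-- every row of T1 paired with every row of T2
SELECT * FROM Table1 T1 CROSS JOIN Table2 T2
```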
Below is a good reference link for the concept of JOIN.
http://blog.sqlauthority.com/2009/04/13/sql-server-introduction-to-joins-basic-of-joins/
Posted by Trupti Jethva at 5:23 PM 0 comments
Labels: Database Testing
Monday, August 2, 2010
Database Queries
Some useful Queries
1. Query to find the nth highest salary (here n = 3; change TOP 3 to TOP n)
SELECT TOP 1 salary
FROM (
SELECT DISTINCT TOP 3 salary
FROM dbo.empsalary
ORDER BY salary DESC) a
ORDER BY salary
2. To find the size of a database
USE URDBNAME
SELECT * FROM sysfiles
EXEC sp_helpdb
EXEC sp_spaceused
3. To generate test data for putting load on a database: create tables and insert records into them
Creation of employee table
USE [tempdb]
GO
/****** Object: Table [dbo].[Emp] Script Date: 03/06/2009 17:09:08 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[Emp](
[no] [numeric](18, 0) NOT NULL,
[Temp1] [varchar](50) NOT NULL,
[Temp2] [varchar](50) NOT NULL,
[Temp3] [varchar](50) NOT NULL,
[Temp4] [varchar](50) NOT NULL,
[Temp5] [varchar](50) NOT NULL,
[Temp6] [varchar](50) NOT NULL,
[number] [numeric](18, 0) NOT NULL,
[date] [datetime] NOT NULL,
CONSTRAINT [PK_Emp] PRIMARY KEY CLUSTERED
(
[no] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
Creation of Employee child table
USE [Test]
GO
/****** Object: Table [dbo].[EmpChild] Script Date: 03/06/2009 17:12:34 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[EmpChild](
[No] [numeric](18, 0) NOT NULL,
[Temp1] [varchar](50) NOT NULL,
[Temp2] [varchar](50) NOT NULL,
[date] [datetime] NOT NULL,
CONSTRAINT [PK_EmpChild] PRIMARY KEY CLUSTERED
(
[No] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
ALTER TABLE [dbo].[EmpChild] WITH CHECK ADD CONSTRAINT [FK_EmpChild_Emp] FOREIGN KEY([No])
REFERENCES [dbo].[Emp] ([no])
GO
ALTER TABLE [dbo].[EmpChild] CHECK CONSTRAINT [FK_EmpChild_Emp]
GO
Insert query for n records (here n = 50; increase the loop bound for more, e.g. 50000)
-- Driver:
DECLARE
@i INTEGER,
@i1 INTEGER
BEGIN
SET @i = 0
SET @i1 = 1
WHILE @i1 <= 50
BEGIN
INSERT INTO emp
VALUES (@i1, 'Temp1', 'Temp2', 'Temp3', 'Temp4', 'Temp5', 'Temp6',@i1,getdate())
SET @i1 = @i1 + 1
END
END
GO
-- Child
DECLARE
@i1 INTEGER
BEGIN
SET @i1 = 1
WHILE @i1 <= 50
BEGIN
INSERT INTO empchild
VALUES (@i1, 'Temp1', 'Temp2', getdate())
SET @i1 = @i1 + 1
END
END
GO
select count(*) from emp;
select count(*) from empchild;
select * from emp;
select * from empchild;
Posted by Trupti Jethva at 5:11 PM 0 comments
Labels: Database Testing
Wednesday, April 21, 2010
Data Migration Testing concept and checklist for testing
Data Migration Testing
Data migration is a process which involves the migration of data from an existing database to a new database. Whenever an organization decides to upgrade or change its database, it will need to transport the existing data to the new database.
The scope of data migration is much more than what the term suggests. Data migration activity typically includes everything that is required with respect to data to ensure that the new database is up and running without any defects, that it contains all the legacy data (data that is present in the existing database), and that the data has been migrated to the correct table(s) and column(s).
Consequently it involves more activities than only migration of existing data as the name suggests.
In a typical scenario it can include following activities:-
A. Migration of existing data to the new database.
B. The new database can have some 'Master Tables'. And Master tables need to be populated with data first to account for various referential constraints. In such cases data migration includes the activity of setting up (master) data in master tables.
C. The new database can have some tables or attributes which are not present in the existing database. Thus a completely new set of data will be setup in these tables.
To understand data migration testing, first understand the data migration process. Data migration is the process of moving the whole production data from one database to a new database. This process might be done using a data migration tool developed as per the data migration requirements.
During this data migration process there is a possibility of data going missing or getting corrupted. So after the data migration process one has to test whether the whole data has been migrated successfully, and whether anything is missing or got corrupted during migration.
So data migration testing is comparing the migrated data with the original data to find out any discrepancies.
When we speak of database migration testing, we can also call it ETL testing. Database migration extracts the data from the source and loads it into the target database.
Two types of Loading:
1. Direct Loading.
2. Incremental Loading.
Direct Loading:
Direct loading simply copies the data from source to target. The number of rows in source and target should be equal.
Incremental Loading:
Incremental loading applies some transformations while loading the data from source to target, such as cleaning the data or loading it under specific conditions.
The challenge is that two different databases have their own structures, so a simple table/record import-export will not work. We need a tool (say, an ETL tool) between the source database and the destination database in which the transformation logic is written; this is done by the Dev team. The testing team will validate in migration testing that all those business rules in the ETL are actually working.
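A minimal sketch of the two loading styles; the source_emp/target_emp table names are made up for illustration:

```sql
-- Direct load: straight copy; row counts should match afterwards
INSERT INTO target_emp
SELECT * FROM source_emp;

-- Incremental load: only rows not already in the target,
-- with a simple cleaning transformation applied on the way
INSERT INTO target_emp (no, name)
SELECT s.no, UPPER(LTRIM(RTRIM(s.name)))
FROM source_emp s
WHERE NOT EXISTS (SELECT 1 FROM target_emp t WHERE t.no = s.no);
```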
Types of Migration
Migration of data from one database to another can be further classified into various subsets.
a. Direct Migration from the source database to the target:
In this type of migration the data elements that are present in the source database are directly transferred into the target database.
b. Migration after performing certain transformations on the source data:
This migration requires some transformation of the source data before it is migrated to the new database.
For example, a field 'Priority' in the source database may allow only one of the three values High, Medium and Low, but in the target database the same field allows one of the three values 1, 2 and 3. Thus a value of High in the source database will be transformed to the value 1 and then migrated to the target database.
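That Priority transformation might be sketched as follows (table and column names are assumed for illustration):

```sql
-- map High/Medium/Low to 1/2/3 while migrating
INSERT INTO target_tasks (task_id, priority)
SELECT task_id,
       CASE Priority
           WHEN 'High'   THEN 1
           WHEN 'Medium' THEN 2
           WHEN 'Low'    THEN 3
       END
FROM source_tasks;
```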
Different types of migration are listed below. These are the migrations which were encountered during testing of data migration project(s). There could be some more types of migration that are presently not part of the list below.
1. Change in datatype.
a. Data in the source table may be present as a character, but in the target table the mapped column is an integer.
2. Concatenation of two or more columns.
3. Mathematical transformations (e.g. summing up of fields etc)
4. Migration of values depending on other fields/tables present in the source database.
c. Target Generated Data:
This is the data that is generated at the target database. These values will not be present in the source database but will be generated each time an insertion or an update takes place on a table. These columns generally form the primary key of a table.
d. Reference at target system generated.
This type of migration deals with data that has to comply with referential integrity constraints. This data is generated at the new database but the data comes from the parent attribute. A reference key in the target database is usually encountered when a table in the source database is split up into two or more tables in the target.
e. Manual setup data in the target database.
Certain tables have to be populated before the migration takes place. These tables can be termed as ‘Master Tables’. These tables are the parent tables to the various referential constraints that are to be implemented in the child tables. The data for these tables may not come from the source database but will be setup manually in the target database.
Checklist for Data Migration Testing
1. Check the table row counts.
2. Check the database integrity.
3. Check the referential integrity.
4. Check whether the data is loaded into the target database correctly.
5. Check the constraints.
6. Check whether all the entities are transformed or not: tables, SPs, views, functions, etc.
7. Check the functionality of your application with the target database; it should work as it is supposed to work.
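The row-count and completeness checks can be automated with simple comparison queries; source_emp/target_emp are placeholder names here:

```sql
-- compare row counts between source and target
SELECT
    (SELECT COUNT(*) FROM source_emp) AS source_rows,
    (SELECT COUNT(*) FROM target_emp) AS target_rows;

-- rows present in the source but missing from the target
SELECT s.no
FROM source_emp s
WHERE NOT EXISTS (SELECT 1 FROM target_emp t WHERE t.no = s.no);
```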
Data Migration Testing Life Cycle
The following are the steps to be followed in the process of Data Migration Testing:-
1. Data Migration Requirement Analysis
2. Test Strategizing
3. Creation of POC
4. Test Planning
5. Design of Automated Test Scripts
6. Test Environment Setup
7. Test Execution
8. Result and Defect Analysis
Useful links:
SELECT TOP 1 salary
FROM (
SELECT DISTINCT TOP 3 salary
FROM dbo.empsalary
ORDER BY salary DESC) a
ORDER BY salary
To find the size of database
SELECT * FROM SYSFILE
USE URDBNAME
EXEC SP_HELPDB
EXEC SP_SPACEUSED
Create table from existing table
CREATE TABLE tblname (colnm) AS SELECT colnm FROM existingtblname
Precondition: table should already exist in select clause, number of column should be same and data type should be same.
To copy only structure of table not data
CREATE TABLE tblname AS SELECT *FROM existingtblnm WHERE (any false condition)
Example: Create table tblname as select * from existingtblnm where 1=2
To insert record into table from existing table
Precondition: both the table exist and data type must same
INSERT INTO table1 AS SELECT * FROM table2
Rename table
RENAME oldtblnm TO newtblnm
Drop table
DROP tablenm
To identify how many tables are there
SELECT *FROM TAB
To see the fields of table
DESC tablename
DUAL
In build table in oracle, only 1 row and 1 column,for getting output on screen of operations or system date
Example: select 2*2 from dual, select sysdate from dual
For viewing tables, views, SP,function from database
SELECT name,xtype FROM sysobjects WHERE xtype in (‘u’,’v’,’p’,’fn’) AND name NOT LIKE ‘dt%’ ORDER BY xtype
Posted by Trupti Jethva at 7:26 PM 0 comments
Labels: Database Testing
Monday, April 11, 2011
About Normalization
What is Normalization?
Normalization is the process of efficiently organizing data in a database. There are two goals of the normalization process: eliminating redundant data (for example, storing the same data in more than one table) and ensuring data dependencies make sense (only storing related data in a table). Both of these are worthy goals as they reduce the amount of space a database consumes and ensure that data is logically stored.
First Normal Form (1NF)
First normal form (1NF) sets the very basic rules for an organized database:
* Eliminate duplicative columns from the same table.
* Create separate tables for each group of related data and identify each row with a unique column or set of columns (the primary key).
Second Normal Form (2NF)
Second normal form (2NF) further addresses the concept of removing duplicative data:
* Meet all the requirements of the first normal form.
* Remove subsets of data that apply to multiple rows of a table and place them in separate tables.
* Create relationships between these new tables and their predecessors through the use of foreign keys.
Third Normal Form (3NF)
The rules of 3NF are mentioned here
* Make a separate table for each set of related attributes, and give each table a primary key.
* If an attribute depends on only part of a multi-valued key, remove it to a separate table
* If attributes do not contribute to a description of the key, remove them to a separate table.
Fourth Normal Form (4NF)
Finally, fourth normal form (4NF) has one additional requirement:
* Meet all the requirements of the third normal form.
* A relation is in 4NF if it has no multi-valued dependencies.
Remember, these normalization guidelines are cumulative. For a database to be in 2NF, it must first fulfill all the criteria of a 1NF database.
Posted by Trupti Jethva at 11:50 AM 0 comments
Labels: Database Testing
Tuesday, December 28, 2010
Concept of Store Procedure
What is Store Procedure
Stored procedures are precompiled database queries (set of SQL statements) that improve the security, efficiency and usability of database client/server applications. Developers specify a stored procedure in terms of input and output variables. They then compile the code on the database platform and make it available to application developers for use in other environments, such as web applications. All of the major database platforms, including Oracle, SQL Server and MySQL support stored procedures. The major benefits of this technology are the substantial performance gains from precompiled execution, the reduction of client/server traffic, development efficiency gains from code reuse and abstraction and the security controls inherent in granting users permissions on specific stored procedures instead of the underlying database tables.
Benefits of Stored Procedures
• Precompiled execution. SQL Server compiles each stored procedure once and then reutilizes the execution plan. This results in tremendous performance boosts when stored procedures are called repeatedly.
• Reduced client/server traffic. If network bandwidth is a concern in your environment, you'll be happy to learn that stored procedures can reduce long SQL queries to a single line that is transmitted over the wire.
• Efficient reuse of code and programming abstraction. Stored procedures can be used by multiple users and client programs. If you utilize them in a planned manner, you'll find the development cycle takes less time.
• Enhanced security controls. You can grant users permission to execute a stored procedure independently of underlying table permissions.
Stored procedures are very similar to user-defined functions, but there are subtle differences. For more information, read Comparing Stored Procedures and User-Defined Functions.
Structure
Stored procedures are extremely similar to the constructs seen in other programming languages. They accept data in the form of input parameters that are specified at execution time. These input parameters (if implemented) are utilized in the execution of a series of statements that produce some result. This result is returned to the calling environment through the use of a recordset, output parameters and a return code. That may sound like a mouthful, but you'll find that stored procedures are actually quite simple.
Example: SELECT Product, Quantity
FROM Inventory
WHERE Warehouse = 'FL'
CREATE PROCEDURE sp_GetInventory
@location varchar(10)
AS
SELECT Product, Quantity
FROM Inventory
WHERE Warehouse = @location
Our Florida warehouse manager can then access inventory levels by issuing the command
EXECUTE sp_GetInventory 'FL'
The New York warehouse manager can use the same stored procedure to access that area's inventory.
EXECUTE sp_GetInventory 'NY'
How to execute Store Procedure
Exec Storeprocedurename @parm1,@parm2,@parm3
Points to keep in mind when Store Procedure
Some of the points mentioned in this article is getting through the following site. I added them for my future reference of getting all information at one point.
http://www.devx.com/getHelpOn/10MinuteSolution/20057/1954
Spell out your Procedure’s Purpose
A stored procedure works best when doing only one thing. However, some shops are under pressure to localize the logic underlying data modifications. This centralizes code maintenance by putting the logic in one procedure instead of three or four (e.g., procedures for insert, update, and delete). Unfortunately, combining these actions makes the underlying procedure exponentially more complicated. Instead of doing only one thing, the procedure is pressed into general service, doing three or four things. Your first line of defense in debugging this type of general-purpose stored procedure is a good header that enables you to document its purpose(s). Once the header is set up and maintained, you have a running history of what is happening in the code.
Header contains details like Author, Date, Last Rev, Revision History, Purpose, Creates, uses, parameters in, parameters out , Relations, Good Joins, Parking Lot
Determine the Flow of Control
By branching based on the parameters sent to the procedure, you delegate control to the relevant sections to do the DML required. Assume you have a procedure that has one parameter: @Action, which could be an Insert, Update, or Delete parameter. To test the value of @Action and perform accordingly, you need a structure something like this:
IF @Action = XXXXXX
BEGIN
Your code here
END
Flesh this out by pseudo-coding your intention first and then creating the logic to accomplish what you intend. Since most errors are logic errors, getting this down in stub form ensures that the code at least executes the flow of control correctly.
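As a sketch of such a structure (the procedure, table, and column names below are hypothetical, not from the article), the @Action branching might be fleshed out like this:

```sql
-- Hypothetical sketch of an @Action-driven maintenance procedure
CREATE PROCEDURE dbo.usp_MaintainCustomer
    @Action VARCHAR(10),      -- 'INSERT', 'UPDATE', or 'DELETE'
    @CustID INT,
    @Name   VARCHAR(50) = NULL
AS
IF @Action = 'INSERT'
BEGIN
    INSERT INTO dbo.Customer (CustID, Name) VALUES (@CustID, @Name)
END
ELSE IF @Action = 'UPDATE'
BEGIN
    UPDATE dbo.Customer SET Name = @Name WHERE CustID = @CustID
END
ELSE IF @Action = 'DELETE'
BEGIN
    DELETE FROM dbo.Customer WHERE CustID = @CustID
END
```

Each branch does exactly one kind of DML, so a test can target one branch at a time by varying @Action.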
Basic things to check while testing a stored procedure
* Naming convention of Store procedure
* Whether SP is part of DB or not
* Name of parameter, No of parameters, type of parameters
* Check for the Outputs
o When output is zero (zero row affected)
o When some records are extracted
o Output contains many records
o What a stored procedure is supposed to do and not supposed to do
o Write simple queries to see whether the stored procedure populates the right data
* Check for the input parameters
* Check for the valid data for each parameter
* Check for the invalid data for each parameter
* Check for the boundary value of data
* Check for all the required parameters
* Check parameter for data type
* Check for the Return values
* Check whether the SP returns any values
* Check that it returns a nonzero value if a failure occurs
* Check for the Error Messages
o Make stored procedure fail and cause every error message to occur at least once
o Find out any exception that doesn’t have a predefined error message
* Check for other values
* Whether a stored procedure grants correct access privilege to a group/user
* See if a stored procedure hits any trigger, index, or rule errors
* Look into the procedure code and make sure major branches are covered by tests.
* Integration check for the procedure
* Group related stored procedures together and call them in a particular order
* If there are many sequences in which a group of procedures can be called, find the equivalence classes and run tests to cover every class.
* Make an invalid calling sequence and run the group of stored procedures.
* Design several test sequences in which end users are likely to do business, and run stress tests
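The return-value and error-message checks above can be sketched in a small T-SQL harness; the procedure name and parameters follow the usp_TestFacilityRoster example later in this article and are assumptions:

```sql
-- Capture the return code; by convention 0 means success, nonzero means failure
DECLARE @rc INT
BEGIN TRY
    EXEC @rc = dbo.usp_TestFacilityRoster 'NEWREINS', 7, 2003, 'A'
    IF @rc <> 0
        PRINT 'Procedure reported failure, return code = ' + CAST(@rc AS VARCHAR(10))
END TRY
BEGIN CATCH
    -- During testing, force the procedure to fail so every error message occurs at least once
    PRINT 'Error ' + CAST(ERROR_NUMBER() AS VARCHAR(10)) + ': ' + ERROR_MESSAGE()
END CATCH
```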
Let the System Do the Work
By creating a parameter mill, you can feed a stored procedure a collection of parameters that represent the universe of things it can accept, enabling you to test the scope of conditions anticipated by your code. You can automate the boundary conditions with a table and procedure to do the drudgery for you.
CREATE PROCEDURE dbo.usp_TestFacilityRoster
@InAction VARCHAR(15),
@EntryMonth INT,
@EntryYear INT,
@FileType CHAR(1)
AS
IF @InAction = 'NEWREINS' OR @InAction = 'NEWREINSSNP'
BEGIN
--do stuff for new members
END
IF @InAction = 'DISENROL' OR @InAction = 'EXISTING'
BEGIN
--do common stuff for existing or disenrolled members
IF @InAction = 'DISENROL'
BEGIN
--specific for disenroll
END
ELSE IF @InAction = 'EXISTING'
BEGIN
--specific for existing
END
END
Create a test plan table with the following code
create table p1 (param1 char(8))
create table p2 (param2 int)
create table p3 (param3 int)
create table p4 (param4 char(1))
create table testplan (param1 char(8), param2 int, param3 int, param4 char(1))
insert p1 values('NEWREINS')
insert p1 values('DISENROL')
insert p1 values('EXISTING')
insert p2 values(1)
insert p2 values(2)
insert p2 values(3)
insert p2 values(4)
insert p2 values(5)
insert p2 values(6)
insert p2 values(7)
insert p2 values(8)
insert p2 values(9)
insert p2 values(10)
insert p2 values(11)
insert p2 values(12)
insert p3 values(1999)
insert p3 values(2000)
insert p3 values(2001)
insert p3 values(2002)
insert p3 values(2003)
insert p3 values(2004)
insert p4 values('A')
insert p4 values('S')
insert p4 values('P')
insert testplan select * from p1,p2,p3,p4
This will create a Cartesian Product (cross join) of all the combinations of the parameters your procedure can accept.
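As a quick sanity check, the row count of testplan should equal the product of the parameter counts: 3 actions × 12 months × 6 years × 3 file types = 648 combinations.

```sql
-- Should return 648 (3 x 12 x 6 x 3), one row per parameter combination
SELECT COUNT(*) FROM testplan
```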
During development, you try to mimic the call to your procedure by writing something like this near the top of the code:
/* TEST BLOCK
set @InAction ='NEWREINS'
set @entrymonth = 7
set @entryyear = 2003
*/
Instead, you can create a string to execute the stored procedure that leaves placeholders for your procedure's parameters:
select 'Execute dbo.usp_TestFacilityRoster ' +
'''' + param1 + '''' + ','
+ cast(param2 as char(2))
+ ','
+ cast(param3 as char(4))
+ ','
+ '''' + param4 + ''''
from testplan
This code doesn't execute it though. You need to use dynamic SQL to execute the string. So the above code can be wrapped in a cursor like this:
set nocount on
declare @param1 varchar(25)
,@param2 varchar(2)
,@param3 varchar(4)
,@param4 varchar(25)
declare C1 Cursor for
select param1, param2, param3, param4
from testplan (nolock)
open C1
fetch next from C1
into @param1, @param2, @param3, @param4
while @@fetch_status = 0
begin
exec dbo.usp_TestFacilityRoster @param1, @param2, @param3,
@param4
fetch next from C1
into @param1, @param2, @param3, @param4
end
close C1
deallocate C1
-- this statement builds the string from the table
-- (the string variables must be declared and initialized first)
DECLARE @sqlstring NVARCHAR(MAX), @sqlstring2 NVARCHAR(MAX)
SET @sqlstring = 'EXEC dbo.usp_TestFacilityRoster '
SET @sqlstring2 = ''
Update Testplan
Set @sqlstring2 = @sqlstring2 + cast(param1 as char(8)) + ', '
+cast(param2 as char(2)) + ', ' + cast(param3 as char(4))+ ', '
+cast(param4 as char(1))
print @sqlstring2
set @sqlstring = @sqlstring + @sqlstring2
execute sp_executesql @sqlstring
You can also use this approach from the command line and imitate the testing procedure outlined above by using the FOR ... IN ... DO construct in a batch file, or a better flow-of-control language from WSH, to make repeated calls to your procedure with the different parameter sets. However, the approach given here lets you work from within Query Analyzer without having to resort to the command window or another language to get the job done.
These ideas were necessary for me to work around the absence of permissions to run the SQL Debugger or to install my own error messages so that I could use RAISERROR in my assignment. This solution has provided a few ideas for getting around similar limitations in your own environment, and it lets you test your code automatically, giving you a higher degree of confidence in the correctness of your solutions.
Posted by Trupti Jethva at 10:40 PM 0 comments
Labels: Database Testing
Thursday, December 9, 2010
Concept of JOIN
Concept of JOIN:
The JOIN keyword is used in an SQL statement to query data from two or more tables, based on a relationship between certain columns in these tables.
INNER JOIN
This join returns rows when there is at least one match in both tables.
Example:
SELECT * FROM Table1 T1 INNER JOIN Table2 T2 ON T1.col1=T2.col1
OUTER JOIN
There are three different Outer Join methods.
LEFT OUTER JOIN
This join returns all the rows from the left table together with the matching rows from the right table. If there is no matching row in the right table, NULL values are returned for the right table's columns.
Example:
SELECT * FROM Table1 T1 LEFT OUTER JOIN Table2 T2 ON T1.col1=T2.col1
RIGHT OUTER JOIN
This join returns all the rows from the right table together with the matching rows from the left table. If there is no matching row in the left table, NULL values are returned for the left table's columns.
Example:
SELECT * FROM Table1 T1 RIGHT OUTER JOIN Table2 T2 ON T1.col1=T2.col1
FULL OUTER JOIN
This join combines the left outer join and the right outer join. It returns rows from either table when the condition is met, and NULL values when there is no match.
Example:
SELECT * FROM Table1 T1 FULL OUTER JOIN Table2 T2 ON T1.col1=T2.col1
CROSS JOIN
This join is a Cartesian join that does not require any join condition. The result set contains a number of records equal to the product of the record counts of the two tables.
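Following the pattern of the other joins above (using the same illustrative Table1 and Table2), a cross join looks like this:

```sql
-- If Table1 has m rows and Table2 has n rows, the result has m * n rows
SELECT * FROM Table1 T1 CROSS JOIN Table2 T2
```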
Below is good reference link for concept of JOIN.
http://blog.sqlauthority.com/2009/04/13/sql-server-introduction-to-joins-basic-of-joins/
Posted by Trupti Jethva at 5:23 PM 0 comments
Labels: Database Testing
Monday, August 2, 2010
Database Queries
Some useful Queries
1. Query to find the nth highest salary (the example below finds the 3rd highest; use TOP n in the inner query for the nth)
SELECT TOP 1 salary
FROM (
SELECT DISTINCT TOP 3 salary
FROM dbo.empsalary
ORDER BY salary DESC) a
ORDER BY salary
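On SQL Server 2005 and later, the same result can be obtained with a ranking function. This is a sketch assuming the same dbo.empsalary table; it finds the 3rd highest distinct salary:

```sql
-- DENSE_RANK assigns 1 to the highest distinct salary, 2 to the next, and so on
SELECT DISTINCT salary
FROM (
    SELECT salary,
           DENSE_RANK() OVER (ORDER BY salary DESC) AS rnk
    FROM dbo.empsalary
) a
WHERE rnk = 3   -- replace 3 with n for the nth highest
```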
2. To find the size of a database
USE YourDBName
SELECT * FROM sysfiles
EXEC sp_helpdb
EXEC sp_spaceused
3. To generate test data for load on the database: create the tables and insert records into them
Creation of employee table
USE [tempdb]
GO
/****** Object: Table [dbo].[Emp] Script Date: 03/06/2009 17:09:08 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[Emp](
[no] [numeric](18, 0) NOT NULL,
[Temp1] [varchar](50) NOT NULL,
[Temp2] [varchar](50) NOT NULL,
[Temp3] [varchar](50) NOT NULL,
[Temp4] [varchar](50) NOT NULL,
[Temp5] [varchar](50) NOT NULL,
[Temp6] [varchar](50) NOT NULL,
[number] [numeric](18, 0) NOT NULL,
[date] [datetime] NOT NULL,
CONSTRAINT [PK_Emp] PRIMARY KEY CLUSTERED
(
[no] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
Creation of Employee child table
USE [Test]
GO
/****** Object: Table [dbo].[EmpChild] Script Date: 03/06/2009 17:12:34 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [dbo].[EmpChild](
[No] [numeric](18, 0) NOT NULL,
[Temp1] [varchar](50) NOT NULL,
[Temp2] [varchar](50) NOT NULL,
[date] [datetime] NOT NULL,
CONSTRAINT [PK_EmpChild] PRIMARY KEY CLUSTERED
(
[No] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
SET ANSI_PADDING OFF
GO
ALTER TABLE [dbo].[EmpChild] WITH CHECK ADD CONSTRAINT [FK_EmpChild_Emp] FOREIGN KEY([No])
REFERENCES [dbo].[Emp] ([no])
GO
ALTER TABLE [dbo].[EmpChild] CHECK CONSTRAINT [FK_EmpChild_Emp]
GO
Insert query for n records (the loop below inserts 50; change the loop bound for any n)
-- Driver:
DECLARE
@i1 INTEGER
BEGIN
SET @i1 = 1
WHILE @i1 <= 50
BEGIN
INSERT INTO emp
VALUES (@i1, 'Temp1', 'Temp2', 'Temp3', 'Temp4', 'Temp5', 'Temp6', @i1, getdate())
SET @i1 = @i1 + 1
END
END
GO
-- Child
DECLARE
@i1 INTEGER
BEGIN
SET @i1 = 1
WHILE @i1 <= 50
BEGIN
INSERT INTO empchild
VALUES (@i1, 'Temp1', 'Temp2', getdate())
SET @i1 = @i1 + 1
END
END
GO
select count(*) from emp;
select count(*) from empchild;
select * from emp;
select * from empchild;
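As an alternative to the WHILE loops above, a set-based insert can generate the same rows in one statement. This is a sketch assuming the emp table created earlier:

```sql
-- Generate rows 1..50 without a loop; increase TOP (n) for more records
;WITH nums AS (
    SELECT TOP (50) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects
)
INSERT INTO emp
SELECT n, 'Temp1', 'Temp2', 'Temp3', 'Temp4', 'Temp5', 'Temp6', n, GETDATE()
FROM nums
```

Set-based inserts are generally faster than row-by-row loops when generating large volumes of test data.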
Posted by Trupti Jethva at 5:11 PM 0 comments
Labels: Database Testing
Wednesday, April 21, 2010
Data Migration Testing concept and checklist for testing
Data Migration Testing
Data migration is a process which involves the migration of data from an existing database to a new database. Whenever an organization decides to upgrade or change its database, it will need to transport the existing data to the new database.
The scope of data migration is much more than what the term suggests. Data migration activity typically includes everything required, with respect to data, to ensure that the new database is up and running without any defects, that it contains all the legacy data (the data present in the existing database), and that the data has been migrated to the correct table(s) and column(s).
Consequently, it involves more activities than just the migration of existing data, as the name might suggest.
In a typical scenario it can include following activities:-
A. Migration of existing data to the new database.
B. The new database can have some 'Master Tables'. And Master tables need to be populated with data first to account for various referential constraints. In such cases data migration includes the activity of setting up (master) data in master tables.
C. The new database can have some tables or attributes which are not present in the existing database. Thus a completely new set of data will be setup in these tables.
To understand data migration testing, first understand the data migration process: shifting the entire production data from one database to a new database. This is often done with a data migration tool developed to meet the migration requirements.
During this process there is a possibility of data being missed or corrupted. So after the migration, one has to test whether the whole data set was migrated successfully, and whether anything was missed or corrupted along the way.
Data migration testing, then, is comparing the migrated data with the original data to find any discrepancies.
Database migration testing can also be described as ETL testing: database migration extracts the data from the source and loads it into the target database.
Two types of Loading:
1. Direct Loading.
2. Incremental Loading.
Direct Loading:
Direct loading simply copies the data from the source to the target. The number of rows in the source and the target should be equal.
Incremental Loading:
Incremental loading applies transformations while loading the data from the source to the target, such as cleaning the data or loading it under specific conditions.
The challenge is that the two databases have their own structures, so a simple table/record import-export will not work. We need a tool (an ETL tool, say) between the source and destination databases in which the transformation logic is written; this is done by the dev team. The testing team then validates, in migration testing, that all those business rules in the ETL actually work.
Types of Migration
Migration of data from one database to another can be further classified into various subsets.
a. Direct Migration from the source database to the target:
In this type of migration the data elements present in the source database are transferred directly into the target database
b. Migration after performing certain transformations on the source data:
This migration requires some transformation of the source data before it is migrated to the new database.
For example, a field ‘Priority’ in the source database may allow only one of the three values High, Medium, and Low, but in the target database the same field allows one of the three values 1, 2, and 3. Thus a High value in the source database will be transformed to a value of 1 and will then be migrated to the target database.
Different types of migrations are listed below. These are the migrations that were encountered while testing data migration project(s); there could be more types of migration that are not yet part of the list.
1. Change in datatype.
a. Data in the source table may be present as a character, but in the target table the mapped column is an integer.
2. Concatenation of two or more columns.
3. Mathematical transformations (e.g. summing up of fields etc)
4. Migration of values depending on other fields/tables present in the source database.
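The Priority example and the transformation types listed above can be sketched in a single migrating INSERT ... SELECT; all database, table, and column names here are hypothetical:

```sql
-- Sketch of a transforming migration (all names are assumptions for illustration)
INSERT INTO TargetDB.dbo.Ticket (TicketID, PriorityCode, FullName, TotalAmount)
SELECT
    CAST(TicketID AS INT),                      -- 1. change in datatype (char -> int)
    CASE Priority WHEN 'High'   THEN 1
                  WHEN 'Medium' THEN 2
                  WHEN 'Low'    THEN 3 END,     -- value transformation (High/Medium/Low -> 1/2/3)
    FirstName + ' ' + LastName,                 -- 2. concatenation of two columns
    Amount1 + Amount2                           -- 3. mathematical transformation
FROM SourceDB.dbo.Ticket
```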
c. Target Generated Data:
This is data that is generated at the target database. These values will not be present in the source database but will be generated each time an insertion or an update takes place on a table. Such columns are generally the primary key of a table.
d. Reference at target system generated.
This type of migration deals with data that has to comply with referential integrity constraints. This data is generated at the new database but the data comes from the parent attribute. A reference key in the target database is usually encountered when a table in the source database is split up into two or more tables in the target.
e. Manual setup data in the target database.
Certain tables have to be populated before the migration takes place. These tables can be termed as ‘Master Tables’. These tables are the parent tables to the various referential constraints that are to be implemented in the child tables. The data for these tables may not come from the source database but will be setup manually in the target database.
Checklist for Data Migration Testing
1. We can check for the table row count.
2. We have to check database integrity.
3. We have to check the referential integrity.
4. Whether the data are loaded in the target database correctly.
5. Check the constraints.
6. Whether all the entities are migrated: tables, SPs, views, functions, etc.
7. Check the functionality of your application with the target database; it should work as it is supposed to work
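The row-count and data checks in the list above can be sketched with a set comparison; the database and table names here are hypothetical, and the EXCEPT comparison assumes the source and target tables have matching column layouts:

```sql
-- Row counts should match between source and target
SELECT COUNT(*) FROM SourceDB.dbo.Customer
SELECT COUNT(*) FROM TargetDB.dbo.Customer

-- Rows present in the source but missing (or different) in the target
SELECT * FROM SourceDB.dbo.Customer
EXCEPT
SELECT * FROM TargetDB.dbo.Customer
```

An empty EXCEPT result in both directions is strong evidence that a direct (non-transforming) migration carried every row across intact.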
Data Migration Testing Life Cycle
The following are the steps to be followed in the process of Data Migration Testing:-
1. Data Migration Requirement Analysis
2. Test Strategizing
3. Creation of POC
4. Test Planning
5. Design of Automated Test Scripts
6. Test Environment Setup
7. Test Execution
8. Result and Defect Analysis
Useful links: