Showing posts with label SQL Server.

Monday, February 6, 2012

PowerShell script to retrieve a deployed WSP from the configuration database: SharePoint 2010

We often find ourselves needing to inspect the contents of a deployed WSP, for example to debug issues or to compare releases.

This is not a trivial task, as the SharePoint Central Administration pages do not provide any option to retrieve a deployed WSP from the SharePoint configuration database.

This task can be accomplished with a PowerShell script. The script below takes two parameters:

- The display name of the solution to retrieve (.wsp file)
- A path on the local machine where the retrieved wsp will be saved.

If no solution name is given, the script displays the list of deployed solutions and lets the user choose which one to retrieve. Many more such scripts are available at the source project for this script: http://sharepointpsscripts.codeplex.com/


param ([string] $name, [string] $localpath)
#Load the required SharePoint assemblies containing the classes used in the script
#The Out-Null cmdlet instructs the interpreter to not output anything to the interactive shell
#Otherwise information about each assembly being loaded would be displayed
[System.Reflection.Assembly]::Load("Microsoft.SharePoint, Version=14.0.0.0, Culture=neutral, PublicKeyToken=71e9bce111e9429c") | Out-Null

function Get-SolutionName()
{
    # Initialize an empty hashtable to store solution names and indexes
    $solHash = @{};
   
    # Bind to the collections of all solutions in the local farm and process them one by one, storing names in a hashtable
    # under automatically incremented indexes
    # The Foreach-Object cmdlet uses the begin/process/end structure to initialize the $i index counter
    ([Microsoft.SharePoint.Administration.SPFarm]::Local).Solutions |
        Foreach-Object {$i=1;} {$solHash.$i = $_.displayname; $i++} { }
   
    Write-Host;
   
    # If solutions were found, present the user with selection
    if ($solHash.Count -gt 0)
    {
        Write-Host -Object "The following solutions were found in the farm:" -ForegroundColor Green -BackgroundColor DarkMagenta;
        Write-Host;
       
        # Hashtables are not sortable, so in order to sort solutions by index keys have to be sorted separately first
        $solHash.Keys | Sort-Object | Foreach-Object {Write-Host -Object $("`t[{0}] {1}" -f $_, $solHash[$_]) -ForegroundColor Yellow -BackgroundColor DarkMagenta;}
        Write-Host;
        Write-Host -Object "Enter the index number of the solution you wish to retrieve, or 0 (zero) to exit: " -NoNewLine -ForegroundColor Green -BackgroundColor DarkMagenta;
       
        # Obtain input from user and return the matching value from the hashtable
        return $solHash[[int](Read-Host)];
    }
   
    # No solutions found in the configuration database
    else
    {
        Write-Host -Object "The local farm's solution store contains no solutions." -ForegroundColor Green -BackgroundColor DarkMagenta;
        Write-Host;
    }
}
# Check if name of target solution was specified as a parameter
if (-not $name)
{
    # Name of solution was not specified, so call the Get-SolutionName function to obtain it
    $name = Get-SolutionName;
   
    # If after calling the function the name is still unknown, stop execution
    if (-not $name)
    { break; }
}
# Check if local path to save the file to was specified as a parameter
if (-not $localpath)
{
    Write-Host;
    # Prompt and obtain local path value from user
    Write-Host -Object "Enter the local path to the folder you want the solution file to be saved to: " -ForegroundColor Green -BackgroundColor DarkMagenta;
    $localpath = Read-Host;
}
# Check if the path specified is valid and throw an exception if it's not
if (-not $(Test-Path -Path $localpath -PathType Container))
{
    throw "`"$localpath`" is not a valid path! If the path contains spaces, it must be enclosed in SINGLE quotes."
}
# Try to bind to the target solution
$solution = ([Microsoft.SharePoint.Administration.SPFarm]::Local).Solutions | Where-Object {$_.Name -eq $name}
Write-Host;
# Check if solution was found
if ($solution -ne $null)
{
    # Constitute the full local path (including file name)
    $solPath = Join-Path -Path $localpath -ChildPath $solution.SolutionFile.Name;
   
    # Try and save solution file locally
    $solution.SolutionFile.SaveAs($solPath);
   
    # If no errors occurred, display a success message
    if ($?)
    {
        Write-Host "Solution file saved successfully to $solPath" -ForegroundColor Green -BackgroundColor DarkMagenta;
        Write-Host;
    }
}
# Solution not found in the store; display a warning message
else
{
    Write-Host -Object "Solution `"$name`" could not be found!" -ForegroundColor Red -BackgroundColor DarkMagenta;
    Write-Host;
}

Monday, January 23, 2012

Cannot connect to database master at SQL Server during installation of SharePoint 2010

When you try to install SharePoint 2010 on your server after running the prerequisites installer, you might receive the following error message:

"Cannot connect to database master at SQL server at SERVERNAME. The database might not exist, or the current user does not have permission to connect to it."

This issue usually occurs when you enter the database instance name in the SharePoint 2010 installation wizard while SQL Server is deployed on a different server from the one on which you are installing SharePoint.

You can follow the steps below to resolve this issue:

1) Navigate to Microsoft SQL Server 2008 and open the "SQL Server Configuration Manager".

2) In the "SQL Server Configuration Manager", select the "Protocols for MSSQLSERVER" section under SQL Server Network Configuration.

3) Ensure that the TCP/IP protocol is enabled; if it is not, enable it.

4) After making this change, restart SQL Server from the "SQL Server Services" section.
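After the steps above, it helps to confirm that the SQL Server port is actually reachable from the SharePoint server before re-running the wizard. A minimal sketch of such a check (Python used here purely for illustration; 1433 is the default port for a default SQL Server instance, so a named instance may listen elsewhere):

```python
import socket

def can_reach(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (SERVERNAME is a placeholder for your SQL Server host):
# can_reach("SERVERNAME", 1433)
```

If the check still fails after TCP/IP is enabled and SQL Server is restarted, also verify that the firewall on the SQL Server machine allows inbound traffic on that port.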


Wednesday, January 11, 2012

SharePoint 2010 Database Server Memory Requirements



The memory required for SharePoint Server 2010 is directly related to the size of the content databases that you are hosting on a server that is running SQL Server.

As you add service applications and features, your requirements are likely to increase. The following table gives guidelines for the amount of memory we recommend.



Combined size of content databases                                        RAM recommended
Minimum for small production deployments                                  8 GB
Minimum for medium production deployments                                 16 GB
Recommendation for up to 2 terabytes                                      32 GB
Recommendation for the range of 2 terabytes to a maximum of 5 terabytes   64 GB


Microsoft's definitions of small and medium deployments are those described in the "Reference Architectures" section of the article Capacity management and sizing for SharePoint Server 2010:
http://technet.microsoft.com/en-us/library/cc261700.aspx


These values are higher than those recommended as the minimum values for SQL Server because of the distribution of data required for a SharePoint Server 2010 environment. For more information about SQL Server system requirements, see Hardware and Software Requirements for Installing SQL Server 2008 (http://go.microsoft.com/fwlink/?LinkId=129377).
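The size-based rows of the table above can be expressed as a simple lookup (an illustrative sketch only; the "small" and "medium" rows depend on deployment type rather than database size, so they are not covered here):

```python
def recommended_ram_gb(content_db_size_tb):
    """Recommended RAM (GB) for a SQL Server hosting SharePoint 2010 content
    databases, based on their combined size in terabytes (per the table above)."""
    if content_db_size_tb <= 2:
        return 32
    if content_db_size_tb <= 5:
        return 64
    raise ValueError("Beyond 5 TB, consult Microsoft capacity planning guidance")

print(recommended_ram_gb(1.5))  # 32
print(recommended_ram_gb(4))    # 64
```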

Other factors that may influence the memory required include the following:

  • The use of SQL Server mirroring.
  • The frequent use of files larger than 15 megabytes (MB).
For more information, please download the whitepaper from the Microsoft site.

Tuesday, January 3, 2012

Use SQL Server Alias to Create SharePoint Farm


SharePoint depends on its connections to one or more SQL Server instances for the overwhelming majority of its content and configuration data. 
The loss of a SQL Server can spell disaster for SharePoint, particularly if the lost SQL Server can’t be restored and a new SQL Server instance (with a new name) must be used. The use of SQL Server names and/or addresses in difficult-to-alter SharePoint database connection strings is what makes this situation problematic.
You can get some protection and peace of mind for this type of scenario by utilizing SQL Server aliases on each of your SharePoint servers. Aliases act as an abstraction layer between your SharePoint servers and your SQL servers. So, instead of:

SharePoint => SQL Server

you end up with:

SharePoint => Alias => SQL Server

In this scenario, pointing your SharePoint member servers to a different back-end SQL Server is as simple as changing one or more SQL aliases. In effect, your SharePoint farm becomes decoupled from the SQL Server(s) it uses.
SQL Server aliases can be created and maintained easily using the SQL Client Network Utility (cliconfg.exe) or the SQL Server Configuration Manager which is installed with the SQL Client Tools.

How to create a SQL Server alias

  1. In SQL Server Configuration Manager, expand SQL Native Client Configuration, right-click Aliases, and then click New Alias.
  2. Press F1, or click Help.
  3. The New Alias (Alias Tab) topic describes the New Alias dialog box, and contains links to valid connection strings for each type of protocol.
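Behind the scenes, an alias is simply a registry value under HKLM\SOFTWARE\Microsoft\MSSQLServer\Client\ConnectTo (with a WOW6432Node counterpart for 32-bit clients on 64-bit Windows): the value name is the alias, and the value data names the network library and the target. For a TCP/IP alias the data takes the form "DBMSSOCN,<server>,<port>". A small sketch of composing that string (the helper function is ours, not part of any SQL tooling):

```python
def tcp_alias_value(server, port=1433):
    """Registry value data for a TCP/IP SQL Server alias.

    DBMSSOCN identifies the TCP/IP network library; the alias name itself
    is the registry value name and is set separately.
    """
    return "DBMSSOCN,{0},{1}".format(server, port)

print(tcp_alias_value("NEWSQLSERVER"))  # DBMSSOCN,NEWSQLSERVER,1433
```

Because the data encodes the real server name, repointing the farm after a SQL Server loss is just a matter of rewriting this one value on each SharePoint server.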

Tuesday, November 15, 2011

SharePoint SSRS Integration in Local Mode

If you are under the impression that SharePoint SSRS integration can work only with a SQL Server Reporting Services report server, that is not the case. In previous versions of Reporting Services this was indeed the only mode available: SharePoint had to be connected to a Reporting Services report server configured in SharePoint Integrated mode in order to render reports using the Report Viewer.

With the release of SQL Server 2008 R2, reports can be rendered without integrating the SharePoint farm with a Reporting Services report server. Instead, the Report Viewer can directly render reports from SharePoint when the data extension supports local-mode reporting. In this mode, however, only SharePoint lists and Microsoft Access 2010 reporting can be used as data sources.

When integrated in local mode, you can render a report that has an embedded data source or a shared data source from an .rsds file. However, you cannot manage the report or its associated data source, as this is not supported in local mode.

Ref : http://technet.microsoft.com/en-us/magazine/ff686706.aspx

Monday, March 7, 2011

How to improve SharePoint performance


Regardless of the framework and methodology you use to understand your SharePoint performance issues, isolate their root causes, and resolve the problems, you need baseline data that reflects acceptable performance against which you can compare data that reflects poor performance. You can obtain a baseline set of performance data right after your initial deployment, during periods of acceptable performance, or from best-practice recommendations. If you have no basis for comparison, you can use Microsoft's published performance recommendations for various measurements, available at

http://technet.microsoft.com/en-us/library/cc262787.aspx


Issue: High disk I/O activity on SQL Server
Possible root causes: Large list operations, timer jobs, SQL maintenance tasks, backup, indexing, inadequate RAM; high-I/O databases (such as tempdb, the transaction log, search, and content) placed on the same disk or on slow disks.
Possible resolutions: Separate the temp and search databases into multiple files across high-I/O disk volumes, increase RAM, use dedicated disks for transaction logs, and defragment and re-index databases weekly.

Issue: SQL blocking/locking
Possible root causes: NIC configuration, large list operations, indexing/crawling jobs.
Possible resolutions: Do not use SharePoint Team Services Administration (STSADM). Use SQL backup, DPM, LiteSpeed, or SQL 2008 with compression; ensure the fill factor is set to 70% on content databases; enforce a 100 GB growth limit.

Issue: Overall slow page loads
Possible root causes: Compression not enabled; caching not enabled or not configured; large pages; redundant SQL trips; underlying network issues.
Possible resolutions: Enable caching and compression, check page load times, examine SQL queries and round trips, and check the NIC for Broadcom 5708 Chimney issues.

Issue: Long time to load full page
Possible root causes: Improper SharePoint object handling in custom code; slow links; SQL blocking; timer jobs; Web Part caching not enabled.
Possible resolutions: Resolve back-end bandwidth and response issues, dispose of objects properly, use 64-bit hardware or configure memory pool limits, and delay downloading core.js.

Issue: Poor list performance
Possible root causes: Large lists (more than 2,000–3,000 items in a level); no indexing on lists; underlying SQL Server issues; too many columns.
Possible resolutions: Index one or more columns, ensure SQL Server performance, and keep fewer than 2,000–3,000 items in a level.

Issue: Long crawl and index times, or indexing causing sluggishness
Possible root causes: Large data volumes requiring long index times; no dedicated index target.
Possible resolutions: Block content with robots.txt; offload crawling/indexing to a dedicated front-end server.

Issue: LDAP operations (such as authentication and user operations) causing usage spikes
Possible root causes: Low bandwidth; a remote domain controller; large profile imports.
Possible resolutions: Increase bandwidth, use Kerberos, and optimize profile importing.

Issue: Backup taking too long
Possible root causes: Using STSADM; other SQL conditions such as blocking.
Possible resolutions: Use Microsoft Data Protection Manager (DPM) or SQL 2008 with compression.

Issue: IIS out-of-memory conditions
Possible root causes: Application pool and worker process recycling; improper object handling; not enough RAM; poor load-balancing architecture.
Possible resolutions: Use IIS overlapped recycling; use 64-bit hardware.






Saturday, February 5, 2011

Dump SharePoint Lists Data into SQL Server Tables using SqlBulkCopy

In one of our projects we needed to clean and normalize SharePoint data and dump it into SQL Server so that any third party could consume it for reporting purposes. For that we created a timer job that runs every night and does the work. Here is the approach we took for the implementation.


Step 1: Create a SharePoint timer job and add these references:

using System;
using System.Data;
using System.Data.SqlClient;
using System.Text;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Administration;



class DatabaseDump : SPJobDefinition
    {
        // No Argument Constructor (required for the system internal operation)
        public DatabaseDump()
        {
        }

        public DatabaseDump(string JobName, SPWebApplication objWebApplication)
            : base(JobName, objWebApplication, null, SPJobLockType.Job)
        {

        }


        public override void Execute(Guid targetInstanceId)
        {

            SPWebApplication currentWebApplication = this.WebApplication;
            try
            {
                using (SPSite currentSite = currentWebApplication.Sites[0])
                {
                    // Call the export routine for the first site collection
                    ExportData(currentSite.Url);
                 }
            }
            catch (Exception ex)
            {
                //exception handling here
            }
        }
    }


Step 2: Create the ExportData function

public void ExportData(string SpSiteURL)
        {
            DataTable SourceTable;
            DataTable UserMappingTable;
            DataRow dr;
            DataRow drMapping;
            string tableName;
            string connectionString;
            SPQuery query;
            try
            {
                connectionString = "SQL SERVER CONNECTION STRING"; // replace with your actual connection string
                if (!string.IsNullOrEmpty(connectionString))
                {
                    using (SqlConnection connection = new SqlConnection(connectionString))
                    {
                        connection.Open();
                        if (connection != null)
                        {
                            using (SPSite objSite = new SPSite(SpSiteURL))
                            {
                                using (SPWeb currentWeb = objSite.OpenWeb())
                               {
                                    //first of all get the data of all lists in the system
                                    foreach (SPList DFList in currentWeb.Lists)
                                    {
                                        try
                                        {
                                            if (!DFList.Hidden)
                                            {
                                                foreach (SPView view in DFList.Views)
                                                {
                                                    // We created a "Reports View" for the data to be exported,
                                                    // but the default view can also be used here
                                                    if (view.Title == "Reports View")
                                                    {
                                                        SourceTable = DFList.GetItems(view).GetDataTable();
                                                        if (SourceTable != null)
                                                        {
                                                            // Clean the table name if required
                                                            tableName = CleanupSQLOBjectName(DFList.Title);
                                                            // Drop and re-create the table first (or truncate it
                                                            // instead if you do not want to re-create it)
                                                            CreateTable(connection, tableName, SourceTable);
                                                            BulkCopy(connection, tableName, SourceTable);
                                                        }
                                                        else
                                                        {
                                                            //no data in this list
                                                            //we need to truncate the existing data for this list now
                                                            tableName = CleanupSQLOBjectName(DFList.Title);
                                                            TruncateTable(connection, tableName);

                                                        }
                                                        break;
                                                    }
                                                }
                                            }
                                        }
                                        catch (Exception ex)
                                        {
                                          //handle exception here
                                        }
                                    }
                                } // using currentWeb
                            } // using objSite
                        } // if (connection != null)
                    } // using connection
                } // if connection string is not empty
            }
            catch (Exception ex)
            {
                //handle exception here
            }
        }




   private string CleanupSQLOBjectName(String SQLObjectName)
        {
            StringBuilder tempSb = new StringBuilder(SQLObjectName);
            tempSb = tempSb.Replace("/", "_");
            //these are two types of dashes
            tempSb = tempSb.Replace("–", "_");
            tempSb = tempSb.Replace("-", "_");
            tempSb = tempSb.Replace("(", "_");
            tempSb = tempSb.Replace(")", "_");
            tempSb = tempSb.Replace(" ", "_");
            tempSb = tempSb.Replace("_x0020_", "_");
            tempSb = tempSb.Replace("_x002f_", "_");
            tempSb = tempSb.Replace("__", "_");
            if (string.Equals(tempSb.ToString(), "group", StringComparison.OrdinalIgnoreCase))
            {
                tempSb = new StringBuilder("group_1"); // "GROUP" is a reserved word in T-SQL
            }
            return tempSb.ToString();
        }
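The cleanup rules above are plain string manipulation and easy to exercise in isolation; here is an equivalent sketch in Python (illustrative only, mirroring the C# replacements, including their single-pass behavior):

```python
def cleanup_sql_object_name(name):
    """Make a SharePoint list title safe to use as a SQL Server table name."""
    # Same replacement list as the C# version (the two dashes are the
    # en dash and the regular hyphen)
    for bad in ("/", "\u2013", "-", "(", ")", " ", "_x0020_", "_x002f_"):
        name = name.replace(bad, "_")
    name = name.replace("__", "_")
    if name.lower() == "group":  # GROUP is a reserved word in T-SQL
        name = "group_1"
    return name

print(cleanup_sql_object_name("My List (Archive)"))
```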




private void CreateTable(SqlConnection connection, string tableName, DataTable sourceTable)
        {
            //first drop this table
            DropTable(connection, tableName);

            //now create the table, deriving the columns from the DataTable's schema;
            //for simplicity every column is created as NVARCHAR(MAX)
            StringBuilder strBuilder = new StringBuilder();
            strBuilder.Append("CREATE TABLE [" + tableName + "] (");
            for (int i = 0; i < sourceTable.Columns.Count; i++)
            {
                if (i > 0)
                {
                    strBuilder.Append(", ");
                }
                strBuilder.Append("[" + sourceTable.Columns[i].ColumnName + "] NVARCHAR(MAX)");
            }
            strBuilder.Append(")");

            using (SqlCommand cmd = new SqlCommand(strBuilder.ToString(), connection))
            {
                cmd.CommandType = CommandType.Text;
                cmd.ExecuteNonQuery();
            }
        }

        //helper used above: drop the table if it already exists
        private void DropTable(SqlConnection connection, string tableName)
        {
            string sql = "IF OBJECT_ID('" + tableName + "', 'U') IS NOT NULL DROP TABLE [" + tableName + "]";
            using (SqlCommand cmd = new SqlCommand(sql, connection))
            {
                cmd.ExecuteNonQuery();
            }
        }

        //helper used in ExportData: remove all rows but keep the table
        private void TruncateTable(SqlConnection connection, string tableName)
        {
            string sql = "IF OBJECT_ID('" + tableName + "', 'U') IS NOT NULL TRUNCATE TABLE [" + tableName + "]";
            using (SqlCommand cmd = new SqlCommand(sql, connection))
            {
                cmd.ExecuteNonQuery();
            }
        }





   private void BulkCopy(SqlConnection connection, string tableName, DataTable SourceTable)
        {
            //now copy the data
            using (SqlBulkCopy bulkcopy = new SqlBulkCopy(connection))
            {
                //Set destination table name
                //to table previously created.
                bulkcopy.DestinationTableName = tableName;
                bulkcopy.WriteToServer(SourceTable);
            }
        }


This will dump all the SharePoint list data into a SQL Server database. You can add code to normalize lookup columns, multi-value columns, etc., and you can also extend it to create child tables for lookup data.
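As a starting point for the lookup-column normalization mentioned above: SharePoint returns lookup and multi-value lookup fields as "ID;#Value" strings, with multi-value fields concatenating the pairs (e.g. "1;#Red;#2;#Blue"). A sketch of parsing that raw format (Python used here for illustration; the helper name is ours):

```python
def parse_lookup(raw):
    """Parse a SharePoint lookup field value of the form 'ID;#Value'.

    Multi-value lookups concatenate pairs: '1;#Red;#2;#Blue'.
    Returns a list of (id, value) tuples suitable for a child table.
    """
    if not raw:
        return []
    parts = raw.split(";#")
    # parts alternate: id, value, id, value, ...
    return [(int(parts[i]), parts[i + 1]) for i in range(0, len(parts) - 1, 2)]

print(parse_lookup("1;#Red;#2;#Blue"))  # [(1, 'Red'), (2, 'Blue')]
```

Each (id, value) pair can then be bulk-copied into a child table keyed by the parent item's ID.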

Let us know if you require any assistance in extending this code.