Friday, March 30, 2012
Regarding how to compress the data file
I have a question regarding how to compress the data file. In my company, the
data file is getting bigger and bigger. Is there a way to compress the
database file to make it smaller so that we can save some disk space? Thanks.
Iter,
First: Do NOT try to use compressed drives.
http://support.microsoft.com/kb/231347/EN-US/
Use DBCC SHOWCONTIG to determine if your dataspace is badly fragmented or,
in SQL 2005, examine the sys.dm_db_index_physical_stats dynamic management
view. If that is a problem, then reorganizing the database space will help
some. Use:
DBCC DBREINDEX or (SQL 2005) ALTER INDEX ... REBUILD ...
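For example, a minimal sketch (MyDB and MyTable are placeholder names, not from the original question):
-- SQL 2000: report fragmentation, then rebuild all indexes on the table
DBCC SHOWCONTIG ('MyTable')
DBCC DBREINDEX ('MyTable')
-- SQL 2005: check fragmentation, then rebuild
SELECT avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats (DB_ID('MyDB'), OBJECT_ID('MyTable'), NULL, NULL, 'LIMITED')
ALTER INDEX ALL ON MyTable REBUILD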
However, the biggest problem is usually retaining more data online than is
actually needed. That would require a policy decision on data retention and
then some development to either archive or delete the expired data. (Of
course, if you keep your archive online, then you have not saved a lot of
disk space, but you may have improved database performance.)
RLF
"Iter" <Iter@.discussions.microsoft.com> wrote in message
news:7786B9FF-FE35-44E6-B92F-8158977FFFFB@.microsoft.com...
> Hi Guys,
> I have question Regarding how to compress the data file. IN my company,
> the
> data file is getting bigger and bigger. Is that a way to compress the
> database file to make it smaller so that we can save some disk space.
> Thanks.
>
|||Hi
"Iter" wrote:
> Hi Guys,
> I have question Regarding how to compress the data file. IN my company, the
> data file is getting bigger and bigger. Is that a way to compress the
> database file to make it smaller so that we can save some disk space. Thanks.
>
I don't think having data files on compressed drives is officially
supported, and compressing the data file will have a severe impact on
performance.
If you are actually wanting to shrink the database file, this is not
usually recommended as it can affect performance (see
http://www.karaszi.com/SQLServer/info_dont_shrink.asp) unless you have an
unusually high amount of free space in the database that you are unlikely to
use (such as when the growth rate is a percentage rather than a fixed amount).
Look at sp_spaceused to see the unused space. DBCC SHRINKFILE can be used to
shrink the file.
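For example (a sketch only; MyDB_Data and the 2048 MB target are placeholders):
EXEC sp_spaceused   -- the 'unallocated space' figure is what could be reclaimed
DBCC SHRINKFILE (MyDB_Data, 2048)   -- shrink the data file to roughly 2048 MB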
John
Regarding DTS
I want to copy the data from an Oracle table to an Excel file in a DTS package.
I have used the transformation task, but it is asking for a table, whereas Excel is a file.
Could anybody please tell me about this?
Thanks and regards,
Purushotham
Create the connections first.
In the connections you will be able to find Excel files as well as Oracle.
Good Luck.
regarding data migration using SSIS packages
Hi,
Currently I am working on data migration using SSIS packages.
I would like to load the data from the flat file to the database (Siebel).
But I need three extra columns, which are not there in the flat file, to load into the database.
The columns are as follows:
1. ROW_BATCH - a fixed value (e.g. 100)
2. ROW_ID - should be unique, just like a sequence
3. ROW_STAT - a fixed value (string)
The flat file columns, along with the above specified columns, should be loaded into the Siebel database using an SSIS package.
Please help me.
You can use a Script Component in the Data Flow Task:
1. Drag and configure a Flat File Connector in a Data Flow Task.
2. Drag a Script Component into your Data Flow Task and select transformation.
3. Connect the output of your Flat File Connector to the Script Component.
4. Double-click the Script Component and select the Inputs and Outputs.
5. Add 2 new columns. The new column Fix has the DataType string. The new column ID has the DataType [DT_I4].
6. Navigate to the Script tab and click Design Script.
7. Edit your script:
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
Public Class ScriptMain
    Inherits UserComponent
    ' Running counter used to generate a unique, sequence-like ROW_ID.
    Dim id As Integer = 0
    ' Called once per input row. Note that the method and buffer names are
    ' generated from the input's name, so they may appear as
    ' Input0_ProcessInputRow/Input0Buffer in your package.
    Public Overrides Sub Input_ProcessInputRow(ByVal Row As InputBuffer)
        Row.Fix = "123"   ' the fixed value column
        id = id + 1
        Row.ID = id       ' the unique, sequence-like column
    End Sub
End Class
8. Connect the Output of the Script-Component to your destination.
|||Thanks loom, thank you very much for your help
Monday, March 26, 2012
reg exporting Crystal Reports to Excel format
I have created a report using Crystal Reports, and the template has a field that is formatted with (PrintTime) and another field (File Creation Time). When I export this report to a .doc file (using VC++ code) I can see these fields properly. But when I export to an .xls file I get ###### in these fields instead of the actual value. These "######" appear only for the time fields and not for the date fields. Also, when the cell is formatted to hold a general value, the cell shows a long number.
Can you help as to what would have caused this problem and how I can rectify this?
The "problem" you have is not really a problem; it's how Excel was designed to work.
Excel shows "####" when the column isn't wide enough to show the entire contents.
Excel stores dates and times as serial numbers (days since January 1, 1900, with the time of day as the fractional part), and it's up to the user to make sure the cell is formatted to display properly.
As for how to fix it, I'm not sure. You can try Crystal's website to see if you can find an article on formatting Excel cells and columns.
http://support.businessobjects.com/search/advsearch.asp|||I have increased the cell size and formatted it for showing time.
The ##### still exists. When I change the cell format to general, a long number is displayed, but in the time format these hashes appear.|||You may not be increasing the width of the column enough. In Excel, assuming that you have the date in cell A1, if you double-click the line that separates the column headers "A" and "B", Excel will automatically resize column "A" to fit the width of the longest cell in the column. I'm not sure if it will help you in Crystal Reports, but in VB I can achieve this with the following line of code:
xlApp.ActiveSheet.Columns("A:A").EntireColumn.AutoFit|||Try printing PrintTime and FileCreationDateTime on different lines. Maybe it's a data issue; I hope there are no special characters in the data. Also try displaying just YYYY from both fields and gradually try adding mm and dd, etc.
Thanks
reg DTS Package
Hi, can anyone help me with this issue? I get one text file as an
attachment from one email id weekly. The file name and email id are the
same, so I want that file to be automatically imported to a table and
updated as per the schedule I have assigned. So when I get that mail, the
previous table should be dropped and a new one should be created... Can
anybody help me with this issue? Any ideas how to do it?
Regards
samay
Instead of dropping and creating the table (and then loading the data) why
not just truncate and load the table each time?
It is helpful that the file name remains the same. That makes the process a
bit easier.
Do you have to use DTS or can you use BCP or BULK INSERT?
Whichever route you choose, you probably want one standard place for the
location of the files. Perhaps you can create a share on your database
server that you can use as an input area for data files that you need to
import.
Create the table structure.
It is fairly easy to use the Import Data wizard to create and save a DTS
package that will truncate a table and load data from a file.
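Something along these lines would do the truncate-and-load (a sketch only; the table name, share path, and delimiters are assumptions):
TRUNCATE TABLE dbo.WeeklyImport
BULK INSERT dbo.WeeklyImport
FROM '\\dbserver\import\weekly.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')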
Do you need any more information or pointers?
Keith Kratochvil
"samay" <sumi_r2@.rediffmail.com> wrote in message
news:1159409542.874173.160470@.b28g2000cwb.googlegroups.com...
> hi can any one help me in this issue.i use to get one text file as an
> attachment from one email id weekly.the file name and email id is
> same.so i want that file to be automaticaly imported to a table and
> updated as per the schedule i have assigned.so wen i get that mail that
> pervious table should be droped and new one should be update...can any
> body help me in this issue?any ideas how to do?
>
> Regards
> samay
>
reg Bulk Insert
I have a query regarding bulk insert. I have some data in a text file,
tab delimited, and am using BULK INSERT to get this data into a physical
table.
I have more columns in the table than in the file. The table has 10
columns and the text file has only 8 columns.
I am getting the below error:
'Bulk insert data conversion error (type mismatch) for row 1, column 8'
Is there any option for BULK INSERT to specify the number of columns to
be inserted in the table?
Thanks in advance
Bhaskar
You can do this by specifying a FORMATFILE on your BULK INSERT statement.
The format file describes the fields in the text file and which are to be
imported. See 'Using Format Files' in the Books Online for more information.
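For example (a sketch; the table and file paths are placeholders):
BULK INSERT dbo.MyTable
FROM 'c:\data\myfile.txt'
WITH (FORMATFILE = 'c:\data\myfile.fmt')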
Hope this helps.
Dan Guzman
SQL Server MVP
"Bhaskar" <Bhaskar@.discussions.microsoft.com> wrote in message
news:9AD6481B-C9E5-4676-ADB4-CF00234E944A@.microsoft.com...
> Hi,
> I have a query reg the bulk insert. I have some data in text file with
> tab delimetered and using BULK INSERT to get this data into a phycical
> table.
> i have more columns in the table than in the file. Table is having 10
> columns and text file is having only 8 columns.
> i am getting the below error ..
> 'Bulk insert data conversion error (type mismatch) for row 1, column 8 '
> Is there any option for Bulk insert to specify about the number columns to
> be inserted in the table?
> Thanks in advance
> Bhaskar|||Hi,
Thanks for the reply. Does anybody have an example of how to create
the format file?
Thanks
Bhaskar
"Dan Guzman" wrote:
> You can do this by specifying a FORMATFILE on your BULK INSERT statement.
> The format file describes the fields in the text file and which are to be
> imported. See 'Using Format Files' the Books Online for more information.
> --
> Hope this helps.
> Dan Guzman
> SQL Server MVP
> "Bhaskar" <Bhaskar@.discussions.microsoft.com> wrote in message
> news:9AD6481B-C9E5-4676-ADB4-CF00234E944A@.microsoft.com...
>
>|||The Books Online has examples of format files with fewer fields than the
destination table as well as with more fields.
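As a rough sketch, a SQL 2000 format file for your case (8 tab-delimited fields mapped to the first 8 of the 10 table columns; the column names are placeholders) might look like this. The two table columns that are not listed must allow NULLs or have defaults:
8.0
8
1   SQLCHAR   0   8000   "\t"     1   Col1   ""
2   SQLCHAR   0   8000   "\t"     2   Col2   ""
3   SQLCHAR   0   8000   "\t"     3   Col3   ""
4   SQLCHAR   0   8000   "\t"     4   Col4   ""
5   SQLCHAR   0   8000   "\t"     5   Col5   ""
6   SQLCHAR   0   8000   "\t"     6   Col6   ""
7   SQLCHAR   0   8000   "\t"     7   Col7   ""
8   SQLCHAR   0   8000   "\r\n"   8   Col8   ""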
Hope this helps.
Dan Guzman
SQL Server MVP
"Bhaskar" <Bhaskar@.discussions.microsoft.com> wrote in message
news:93DFC97E-B9D7-461E-BD17-702996CA0F96@.microsoft.com...
> Hi ,
> Thanks for the reply. Is there any example with anybody about how to
> create
> the Format file..
> Thanks
> Bhaskar
> "Dan Guzman" wrote:
>
Refreshing a CSV file using DTS
I have a DTS job that populates a CSV file on a regular basis.
It appears that by default, each time the job runs the contents of the file
are overwritten with the new data from SQL Server.
Can anyone confirm that this is the case?
And is it possible to have DTS append to rather than overwrite the contents
of the file?
That's correct...the default is to overwrite.
The append option isn't supported with the text file
provider. The workaround is generally to generate a second
file and then use FileSystemObject to create a single file,
or a DOS copy command to combine the two files into a third
file.
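For example, as an extra step in the package (the file names here are placeholders):
EXEC master..xp_cmdshell 'copy c:\output\master.csv + c:\output\new.csv c:\output\combined.csv'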
-Sue
On Thu, 24 Feb 2005 07:22:52 -0800, "Dave" <dave@.nospam.ru>
wrote:
>I have a DTS job that populates a CSV file on a regular basis.
>It appears that by default, each time the job runs the contents of the file
>are overwritten with the new data from SQL Server.
>Can anyone that confirm that this is the case?
>And is it possible to have DTS append to rather than overwrite the contents
>of the file?
>
refresh SSIS pkg to get the change
How do you refresh an SSIS package to get the latest table schema change?
I have this data flow task that will load data from a flat file into a table.
I got a Warning: Truncation may occur due to retrieving data from database column "txtSNumber" with a length of 50 to data flow column "txtSNumber" with a length of 20.
Then I went into Management Studio and changed the column size.
Now in my dev BIDS I get the following:
[SQL Server 05[82]] Warning: The external metadata column collection is out of synchronization with the data source columns. The column "txt..." needs to be updated in the external metadata column collection.
I think this means my change on the table did not get into my SSIS package, and I could not find a way to refresh it.
Thanks!
It actually means the opposite. The change was detected by SSIS and it needs to update the metadata associated with the table in the data flow task (i.e. change the column width from 20 to 50). It does not automatically do this without some user interaction (hitting "OK") since you may need to know about the change and correct it if it is incorrect.|||
Thanks for the reply.
I am not looking for the automatic way to refresh if there is one.
I want to manually start the process in my design environment and click the OK as you have suggested, but I could not find out how or from where to make it happen.
I have tried right-clicking on my .dtsx and selecting reload with upgrade, but that is not it. Where can I tap into the metadata?
|||Look at the components in your data flow that have the yellow triangle with an exclamation mark in it. Start there by double-clicking on that component to refresh the metadata.|||Double-clicking on the yellow mark is the trick! Thanks!
Friday, March 23, 2012
Refresh data from production to development
I need to refresh data from the production env to the development env.
What I got is a backup file of a db in the prod env; I now need to
restore that into the development env.
I can restore it to the dev env no problem, but the warnings I got are
that the table owners in the prod and dev env need to be different, that
is, the owner is A in the prod env and the owner is B in the dev env.
So I need to:
1) restore the db in the dev env
2) change the table owner from A to B
3) change related triggers
4) change related views
Can anyone suggest an approach that is most efficient?
Thanks a lot.
Hiya Bosco,
I'd create a script that dynamically builds an "sp_changeobjectowner"
statement for all objects that user A owns. It'll simplify the steps
to this:
1. Restore db in dev
2. Create and run permissions script
Syntax is as follows:
EXEC sp_changeobjectowner '<table_name>', 'B'
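A rough sketch of such a generator (it assumes the SQL 2000-era sysobjects/sysusers catalog tables and owner names A and B; run the SELECT, then execute its output):
-- build one sp_changeobjectowner statement per object owned by user A
SELECT 'EXEC sp_changeobjectowner ''A.' + o.name + ''', ''B'''
FROM sysobjects o
JOIN sysusers u ON o.uid = u.uid
WHERE u.name = 'A'
AND o.type IN ('U', 'V', 'P', 'TR')   -- tables, views, procedures, triggers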
|||Bosco
Here are some articles that should help you.
http://www.support.microsoft.com/?id=314546 Moving DB's between Servers
http://www.support.microsoft.com/?id=224071 Moving SQL Server Databases
to a New Location with Detach/Attach
http://support.microsoft.com/?id=221465 Using WITH MOVE in a Restore
http://www.support.microsoft.com/?id=246133 How To Transfer Logins and
Passwords Between SQL Servers
http://www.support.microsoft.com/?id=298897 Mapping Logins & SIDs after
a Restore
http://www.dbmaint.com/SyncSqlLogins.asp Utility to map logins to users
http://www.support.microsoft.com/?id=168001 User Logon and/or
Permission Errors After Restoring Dump
http://www.support.microsoft.com/?id=240872 How to Resolve Permission
Issues When a Database Is Moved Between SQL Servers
Hope this helps
John
Refining MS ACCESS data to csv file using T-SQL
Hello,
I have a table with these sample data pattern:
Col1 Col2
001 2
002 1
002 2
002 1
003 4
003 4
003 3
003 3
003 5
The expected output in the csv file should be exactly like this:
001,2
002,4
002,
002,
003,19
003,
003,
003,
003,
In the csv file, the number next to the topmost 002 is the sum of Col2 where Col1
is 002, and the field next to the other 002 rows will be blank; the same process repeats for 003 and so on.
Would anyone please help me to make a T-SQL statement for the above requirement? I am using VB code.
Thanks
k
1. Add an AutoNumber field named ID to your table;
2. Create the query in Access:
SELECT t1.Col1, t2.sumCol2
FROM yourTable AS t1 LEFT JOIN [SELECT Col1, SUM(Col2) as sumCol2, min(ID) as minID
FROM yourTable Group By Col1]. AS t2 ON (t1.ID=t2.minID) AND (t1.Col1=t2.Col1);
3. Export this query to a csv file from Access.
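In case it helps, a rough T-SQL equivalent of the same idea (it assumes the table has an identity column named ID; the non-first rows in each group come back NULL, which exports as a blank field):
SELECT t1.Col1, t2.sumCol2
FROM yourTable AS t1
LEFT JOIN (SELECT Col1, SUM(Col2) AS sumCol2, MIN(ID) AS minID
           FROM yourTable
           GROUP BY Col1) AS t2
  ON t1.ID = t2.minID AND t1.Col1 = t2.Col1
ORDER BY t1.Col1, t1.ID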
|||Thank you very much limno. Your T-SQL code perfectly solved my problem.
k
Monday, March 12, 2012
Referenced Memory Problem with Enterprise Manager
When I attempt to attach a database using the wizard in the manager, I get
the following error when I click on the button to select the MDF file of the
database to attach:
The instruction at "0x41d53c0a" referenced memory at "0xffffffff". The
memory could not be read.
I've reinstalled SQL SP3a and Windows 2003 patches. Is this error a result
of a memory hardware problem or a conflict with another installed
application?
How can I resolve the problem since I am unable to use the manager for this
function?
thanks.
Could be several issues. Could be a bad SQL Server binary, mismatched
MDAC, or a bad MMC build. Or, your worst fear, the .MDF file may be locked or
corrupt.
I'd try to attach to another SS server and/or use Query Analyzer and
sp_attach_db to narrow down whether it is the .MDF file, the local SS
installation, or just the SQL EM tool.
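For example (a sketch; the database name and file paths are placeholders):
EXEC sp_attach_db @dbname = N'MyDB',
    @filename1 = N'C:\Data\MyDB.mdf',
    @filename2 = N'C:\Data\MyDB_log.ldf'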
Sincerely,
Anthony Thomas
"Dan Slaby" <dslaby3@.comcast.net> wrote in message
news:eq6sX9WAFHA.2180@.TK2MSFTNGP10.phx.gbl...
When I attempt to attach a database using the wizard in the manager, I get
the following error when I click on the button to select MDF file of
database to attach:
The instruction at "0x41d53c0a" referenced memory at "0xffffffff". The
memory could not be read.
I've reinstalled SQL SP3a and Windows 2003 patches. Is this error a result
of a memory hardware problem or a conflict with another installed
application?
How can I resolve the problem since I am unable to use the manager for this
function?
thanks.|||I resolved the problem by making another install of SP3a.
"Dan Slaby" <dslaby3@.comcast.net> wrote in message
news:eq6sX9WAFHA.2180@.TK2MSFTNGP10.phx.gbl...
> When I attempt to attach a database using the wizard in the manager, I get
> the following error when I click on the button to select MDF file of
> database to attach:
> The instruction at "0x41d53c0a" referenced memory at "0xffffffff". The
> memory could not be read.
> I've reinstalled SQL SP3a and Windows 2003 patches. Is this error a result
> of a memory hardware problem or a conflict with another installed
> application?
> How can I resolve the problem since I am unable to use the manager for
> this function?
> thanks.
>
>
Friday, March 9, 2012
Reference for Security Nazis
Does anyone know of a quick reference I could provide to the IT security
folks at my work that outlines what file extensions, ports, and DLLs SQL
Server uses? They've gone hog wild with 'security' software here to the
point that they invariably end up shutting down one behavior or another
within SQL each time they do a 'security upgrade'. Grrrr.
TW
Since no one replied, and for the sake of the group, I'll post what I've
come up with *so far* as far as necessary ports, file extensions, etc. that SQL
needs...
PORTS:
1433
PROTOCOLS:
smtp
FILE EXTENSIONS:
.mdf
.ldf
.bak
.sql
.csv
.rpt
...I'll update as I come up with more
--TW
"Tech Witch" <tech.witch@.gmail.NOSPAM.com> wrote in message
news:Q_Yzd.6775$iC4.5546@.newssvr30.news.prodigy.com...
> Does anyone know of a quick reference I could provide to the it security
> folks at my work that outlines what file extensions, ports, and dll's sql
> server uses? They've gone hog wild with 'security' software here to the
> point that they invariably end up shutting down one behavior or another
> within SQL each time they do a 'security upgrade'. Grrrr.
> TW|||Hi
In some ways your security guys should be telling you what they want to do
and what effect it should have. You should change from the standard ports;
SMTP is not needed as a default. I'm not sure why file extensions are needed,
but you should avoid virus checking the database files: none of the extensions
you mention contain executable code. Restricted access to programs and
directories should be implemented, and non-essential ones removed.
There is a lot of information on http://www.sqlsecurity.com
http://www.sqlsecurity.com/DesktopDefault.aspx?tabid=22
http://www.sqlsecurity.com/DesktopDefault.aspx?tabid=24
http://www.microsoft.com/security/g.../SQLServer.mspx
http://www.microsoft.com/technet/pr...n/sp3sec04.mspx
http://www.microsoft.com/technet/se...s/mbsahome.mspx
http://www.microsoft.com/downloads/...&displaylang=en
You should be working with the security guys, they may have already saved
your skin.
John
"Tech Witch" <tech.witch@.gmail.NOSPAM.com> wrote in message
news:SkCAd.4116$F67.400@.newssvr12.news.prodigy.com...
> Since no one replied, and for the sake of the group, I'll post what I've
> come up *so far* as far as necessary ports, file extensions, etc that SQL
> needs ...
> PORTS:
> 1433
> PROTOCOLS:
> smtp
> FILE EXTENSIONS:
> .mdf
> .ldf
> .bak
> .sql
> .csv
> .rpt
> ...I'll update as I come up with more
> --TW
> "Tech Witch" <tech.witch@.gmail.NOSPAM.com> wrote in message
> news:Q_Yzd.6775$iC4.5546@.newssvr30.news.prodigy.com...
>> Does anyone know of a quick reference I could provide to the it security
>> folks at my work that outlines what file extensions, ports, and dll's sql
>> server uses? They've gone hog wild with 'security' software here to the
>> point that they invariably end up shutting down one behavior or another
>> within SQL each time they do a 'security upgrade'. Grrrr.
>>
>> TW
>>
>>|||Thanks, John. I will check these links out. To give you a better idea of
what I'm up against, we had several incidents where our security folks
applied a series of patches that caused some undesirable effects like using
a resource-intensive application to perform virus scanning on files with
.BAK, .MDF, and .LDF extensions each time the files were written (some were
in the GIGs), then they disabled our SMTP ports on the servers with some
firewall software, causing our SQL alerts to stop being sent to DBA's (we
don't use SQL mail here). I'm getting tired of them not telling us what
they are changing and not testing the results of said actions, so I want to
come up with a list of necessary SQL functions & files for them to refer to
next time. Thanks again for your suggestions. I'll post an updated list
when I'm done.
TW
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:41d3238e$0$5263$afc38c87@.news.easynet.co.uk...
> In some ways your security guys should be telling you what they want to do
> and what effect it should have. You should change from the standard ports,
> SMTP is not needed as a default. Not sure why file extensions are needed,
> you should avoid virus checking the database files, None of the extension
> you mention contain executable code, restricted access to programs and
> directories should be implemented, and non-essential ones removed.
> <snip>
> You should be working with the security guys, they may have already saved
> your skin.
Reference a flat file (as a table) in a sql statement
I seem to remember being able to do something like:
select * into 'c:\out.txt' from SomeTable
Does anyone know if this functionality exists? If so, what is the syntax?
TIA
You might be able to get what you are looking for using a text file linked server using sp_addlinkedserver (http://msdn.microsoft.com/library/default.asp?url=/library/en-us/tsqlref/ts_sp_adda_8gqa.asp) (it is example H). You might also get what you want using OSQL.EXE (http://msdn.microsoft.com/library/en-us/coprompt/cp_osql_1wxl.asp) too.
-PatP
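For reference, that Books Online example is roughly as follows (a sketch; it assumes a c:\data\distqry folder containing the text files plus a schema.ini describing them):
EXEC sp_addlinkedserver txtsrv, 'Jet 4.0',
    'Microsoft.Jet.OLEDB.4.0',
    'c:\data\distqry',
    NULL,
    'Text'
GO
-- then query a file in that folder as if it were a table
SELECT * FROM txtsrv...[file1#txt]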
|||Originally posted by Pat Phelan
You might be able to get what you are looking for using a text file linked server using sp_addlinkedserver (http://msdn.microsoft.com/library/default.asp?url=/library/en-us/tsqlref/ts_sp_adda_8gqa.asp) (it is example H). You might also get what you want using OSQL.EXE (http://msdn.microsoft.com/library/en-us/coprompt/cp_osql_1wxl.asp) too.
-PatP
Set up a linked server, then use SELECT * FROM OPENQUERY(linked_server, file) or whatever else you want to do.|||I feel OSQL is a better tool to use than a linked server.|||Originally posted by Satya
I feel OSQL is a better tool to use than a linked server.
Why? And can you give the guy an example of how this would work for him, including how he'll set up the needed data source?|||Thanks for the replies. It doesn't look like it is possible to use the method I thought I'd remembered. Your way is more than sufficient. Thanks.|||Ummmm... bcp with queryout?|||Queryout? I did use bulk copy in a query to import a fixed-width flat file using a format file, but I don't know what queryout is...|||Well, I thought you were talking about outbound...
USE Northwind
GO
-- fully qualify the table: bcp opens its own connection and does not inherit
-- the USE above; supply a login with -U/-P or use -T for a trusted connection
EXEC master..xp_cmdshell 'bcp "SELECT * FROM Northwind.dbo.Orders" queryout c:\tax\Test.txt -Sservername\instance -U -P -c'
Reference a file name
Hi,
How can I reference the ldf or mdf file (using the logical names) of a
particular database when I'm in a different context? For instance, I've got
two databases, PG and PG1, and being in PG1 I want to do the following:
DBCC SHRINKFILE (PG_Log ,100)
Any ideas?
One option,
EXEC('Use PG DBCC SHRINKFILE (PG_Log ,100)')
-Sue
On Thu, 27 Apr 2006 17:35:01 -0700, FJC
<FJC@.discussions.microsoft.com> wrote:
>Hi,
>How can I reference the ldf or mdf (using the logical names) file of a
>particular database when I'm in a diferent context, for instance, I've got
>two databases PG and PG1 and being in PG1 I want to do the following
>DBCC SHRINKFILE (PG_Log ,100)
>Any ideas?|||Thanks Sue, it worked.
"Sue Hoegemeier" wrote:
> One option,
> EXEC('Use PG DBCC SHRINKFILE (PG_Log ,100)')
> -Sue
> On Thu, 27 Apr 2006 17:35:01 -0700, FJC
> <FJC@.discussions.microsoft.com> wrote:
>
>
Wednesday, March 7, 2012
Reducing the size of the transaction log file
Kamen
Hi,
I had the same issue yesterday. I had to back up the log file, then use shrink, back up one more time, and shrink again.|||1. Check your database 'Recovery Model' and set it according to your requirements.
2. Make a backup plan matching your data sensitivity and requirements, and schedule it.
3. Plan log backups along with the log truncate option.
Refer to Books Online from SQL Query Analyzer.
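For example (a sketch; MyDB, the backup path, and the 100 MB target are placeholders):
BACKUP LOG MyDB TO DISK = 'c:\backup\MyDB_log.trn'
GO
DBCC SHRINKFILE (MyDB_Log, 100)   -- shrink the log file to roughly 100 MB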
|||Thank you very much!
Kamen