Hottest Forum Q&A on CodeGuru from the week of March 14th, 2004

Lots of hot topics are covered in the Discussion Forums on CodeGuru. If you missed the forums this week, you missed some interesting ways to solve a problem. Some of the hot topics this week include:


How do I use a progress bar with large files?

Thread:

jkw2068 is using a progress bar while copying large files. Unfortunately, the progress bar does not update properly.

Up till now, I have been using
m_progress.SetPos(m_progress.GetPos() + 60);
and so on. The only problem occurs when a large file is being
copied over, say 8-10MB. The bar just hangs until the file is copied
over and then jumps to my calculated completeness value.

How can I make the progress bar update in real time?

Here is the code that jkw2068 is using:

void CCopyDlg::CopyProgress(void)
{
   m_progress.SetPos(m_progress.GetPos() + 30);
   int copy = CopyFile(m_strFILELOCATION,"A:\\"
            + m_strFILENAME,FALSE);    //Copying file

   if(copy > 0){
      m_progress.SetPos(m_progress.GetPos() + 60); 
      CStdioFile csf;
      csf.Open("a:\\autoexec.bat", CFile::modeWrite |
                                   CFile::modeCreate |
                                   CFile::typeText);

      //Need to handle on error for file creation/write.

      m_progress.SetPos(m_progress.GetPos() + 90);
      csf.WriteString(m_strFILENAME);
      csf.Close();

      m_progress.SetPos(m_progress.GetPos() + 100);
      MessageBox("Disk is ready for use", "Done", MB_OK |
                 MB_ICONINFORMATION);

      OnCancel();
   }
   else{
      m_progress.SetPos(m_progress.GetPos() + 100);
      MessageBox("Unable to copy the file " + m_strFILENAME,
                 "ERROR", MB_OK | MB_ICONERROR);
      OnCancel();
   }
}

Well, this is a simple member function of your second dialog and, as long as execution stays inside that function, the message queue of that dialog will not be processed. In other words, your progress bar will not be repainted. So, the best technique is to tell the progress bar to repaint itself by calling UpdateWindow(). Thanks to Andreas Masur for providing the necessary code:

void CCopyDlg::CopyProgress(void)
{
   m_progress.SetPos(m_progress.GetPos() + 30);
   m_progress.UpdateWindow();    // update the progress bar

   int copy = CopyFile(m_strFILELOCATION,"A:\\"
            + m_strFILENAME,FALSE);    //Copying file
   if(copy > 0){
      m_progress.SetPos(m_progress.GetPos() + 60);
      m_progress.UpdateWindow();    // update the progress bar

      CStdioFile csf;
      csf.Open("a:\\autoexec.bat", CFile::modeWrite |
                                   CFile::modeCreate |
                                   CFile::typeText);

      // Need to handle on error for file creation\write.

      m_progress.SetPos(m_progress.GetPos() + 90);
      m_progress.UpdateWindow();    // update the progress bar

      csf.WriteString(m_strFILENAME);
      csf.Close();

      m_progress.SetPos(m_progress.GetPos() + 100);
      m_progress.UpdateWindow();    // update the progress bar

      MessageBox("Disk is ready for use", "Done", MB_OK |
                 MB_ICONINFORMATION);
   }
   else{
      m_progress.SetPos(m_progress.GetPos() + 100);
      m_progress.UpdateWindow();    // update the progress bar

      MessageBox("Unable to copy the file " + m_strFILENAME,
                 "ERROR", MB_OK | MB_ICONERROR);
   }
}
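One caveat worth adding as an aside (this is not from the thread): UpdateWindow() repaints the bar between the steps above, but the bar still freezes during the single blocking CopyFile() call. On Win32, CopyFileEx() can report progress through a callback while one file is being copied. A minimal sketch, assuming the same m_progress control from the dialog above:

```cpp
// Callback invoked repeatedly by CopyFileEx() while the file copies.
static DWORD CALLBACK CopyProgressRoutine(LARGE_INTEGER totalSize,
                                          LARGE_INTEGER transferred,
                                          LARGE_INTEGER streamSize,
                                          LARGE_INTEGER streamTransferred,
                                          DWORD streamNumber,
                                          DWORD callbackReason,
                                          HANDLE hSource,
                                          HANDLE hDest,
                                          LPVOID lpData)
{
   CProgressCtrl* pBar = reinterpret_cast<CProgressCtrl*>(lpData);
   if (totalSize.QuadPart > 0)
   {
      // Map the bytes copied so far onto the 0-100 range of the bar.
      pBar->SetPos(static_cast<int>(
         transferred.QuadPart * 100 / totalSize.QuadPart));
      pBar->UpdateWindow();
   }
   return PROGRESS_CONTINUE;
}

// Replacing the CopyFile() call in the dialog would then look like:
//    CopyFileEx(m_strFILELOCATION, "A:\\" + m_strFILENAME,
//               CopyProgressRoutine, &m_progress, NULL, 0);
```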


Is it possible to open a file with ShellExecute without knowing the document type?

Thread:

joleary wants to open a file with ShellExecute/ShellExecuteEx or CreateProcess. Unfortunately, he does not know the extension of that file. Is it still possible to open it?

I have a filename provided by a user.
All I know for sure is that the file exists.

I want to open it with whatever application is associated with
this file and wait until that application is closed before
proceeding.

If ShellExecute/Ex (or whatever I end up using) cannot find an
association, I tell the user that I can't open the file and carry on.

If I am able to open the file, I want to wait until that application
(whatever it is) is closed before continuing with my processing.

So, I've learned how ShellExecute/Ex can execute the associated
application for me, but I do not understand how to wait for
whatever application that was actually executed to finish
because I don't know how to get a (guaranteed) viable process
handle from ShellExecute/Ex.

And I don't know how to make use of CreateProcess in this situation
to get that viable handle.

Is there another API I can use, to the Registry or somewhere, to
tell me, up front, what the file association is so I can use
CreateProcess?

You can use CreateProcess(), but you will need to find the associated application for the specific extension on your own, because CreateProcess() will not do it for you. The Win32 function FindExecutable() performs exactly that lookup.
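The pieces fit together roughly as follows. This is a hedged sketch rather than code from the thread: OpenAndWait() is a hypothetical helper, error handling is minimal, and it assumes the Windows headers and a TCHAR build:

```cpp
#include <windows.h>
#include <shellapi.h>
#include <tchar.h>

// Hypothetical helper: opens 'pathToFile' with its associated
// application and blocks until that application exits. Returns
// FALSE if no association was found or the process did not start.
BOOL OpenAndWait(LPCTSTR pathToFile)
{
   TCHAR exePath[MAX_PATH] = _T("");

   // FindExecutable() looks up the application associated with the
   // file's extension; it returns a value greater than 32 on success.
   if (reinterpret_cast<INT_PTR>(
         FindExecutable(pathToFile, NULL, exePath)) <= 32)
      return FALSE;   // no association -> tell the user and carry on

   TCHAR cmdLine[2 * MAX_PATH + 4];
   _sntprintf(cmdLine, sizeof(cmdLine) / sizeof(TCHAR),
              _T("\"%s\" \"%s\""), exePath, pathToFile);

   STARTUPINFO si = { sizeof(si) };
   PROCESS_INFORMATION pi;
   if (!CreateProcess(NULL, cmdLine, NULL, NULL, FALSE,
                      0, NULL, NULL, &si, &pi))
      return FALSE;

   // CreateProcess() hands back a guaranteed-viable process handle,
   // so we can simply wait for the application to close.
   WaitForSingleObject(pi.hProcess, INFINITE);
   CloseHandle(pi.hThread);
   CloseHandle(pi.hProcess);
   return TRUE;
}
```

Note that ShellExecuteEx() with SEE_MASK_NOCLOSEPROCESS can also return a process handle in SHELLEXECUTEINFO::hProcess, but that handle may be NULL when the document is handed off to an already-running application, which is exactly the "guaranteed viable handle" problem the OP describes.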


Why does std::numeric_limits<double>::max() not work?

Thread:

ephemera needs to work with std::numeric_limits<double>::max(), but gets an error even though everything seems correct. What can cause such an error? Take a look.

I used the std::numeric_limits<double>::max();, but Visual C++ 6.0
reported the error C2589:
'(' : illegal token on right side of '::'
G:\MY program\temp\test1\test1.cpp(12) : error C2143: 
syntax error : missing ';' before '::'
Anyone know whether Visual C++ 6.0 supports std::numeric_limits?

Well, it looks as though he forgot the using namespace std; directive, but he had already done that. The real problem is that max is already defined as a macro (by <windows.h>, for example). Add the following lines before the call to std::numeric_limits<double>::max():

#ifdef max
#undef max
#endif


Is the C Array style faster than std::vector?

Thread:

etaoin raised a very interesting question. Take a look:

I hope that the stl gurus can help me with this...
Consider the following: In my current project, a team member
implemented code such as the following:
int*    m_timeArray;
string* m_nameArray;
double* m_valueArray;

// Dynamically allocate space for the arrays here...

for(int i=0; i < numElements; i++)
{
  m_timeArray[i]  = someTime;
  m_nameArray[i]  = someName;
  m_valueArray[i] = someValue;
}
Because these C-style arrays are actually representing an array of
objects (each consisting of a time, a name, and a value, in this
simplified example), I suggested to:

  • create a class which has time, name and value as members, and
  • use a std::vector instead of a C array.

Something along the lines of:

class Element
{
public:
  Element(int time, const char* pName, double value);
  virtual ~Element();

private:
  int    m_time;
  string m_name;
  double m_value;
};
vector<Element> elements;

Well, this is just part of the OP's question; the full explanation is much longer than can be reproduced here. In summary, the concern was that using push_back to add elements to the vector creates a temporary object, effectively copying the data twice, so the vector version would be more costly than the C-style code.

In general, using the STL classes does not necessarily mean that the resulting code will perform worse than the old code; with vector, it is most of the time the other way around. One common mistake is to compare debug builds instead of release builds.

Furthermore, if a vector performs much slower even in a release build, it is most likely because of how the vector is being used, not because of the vector implementation itself. In this case, for example, you should reserve enough memory for numElements elements up front, so that no additional memory allocation is necessary during push_back(). Take a look at the whole thread to learn more about this topic. Besides that, take a look at the following article.



About the Author

Sonu Kapoor

Sonu Kapoor is an ASP.NET MVP and MCAD. He is the owner of the popular .NET website http://dotnetslackers.com. DotNetSlackers publishes the latest .NET news and articles, and contains forums and blogs as well. His blog can be found at http://dotnetslackers.com/community/blogs/sonukapoor/
